Tina Jeffrey
Governments around the world, in particular those of the United States and the European Union, are calling for the standardization of ADAS features. Meanwhile, consumers are demonstrating a readiness to adopt these systems to make their driving experience safer. In fact, vehicle safety rating systems are becoming a vital ‘go to’ information resource for new car buyers. Take, for example, the European New Car Assessment Programme Advanced (Euro NCAP Advanced). This organization publishes safety ratings on cars that employ technologies with scientifically proven safety benefits for drivers. The emergence of these ratings encourages automakers to exceed minimum statutory requirements for new cars.
Sizing the ADAS market
ABI Research claims that the global ADAS market, estimated at US$16.6 billion at the end of 2012, will grow to more than US$260 billion by the end of 2020, representing a CAGR of 41%. This means that cars will increasingly ship with safety-certified systems such as lane departure warning, autonomous emergency braking, road sign detection, and rear-view cameras with obstacle detection.
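That CAGR figure is easy to sanity-check with a few lines of Python, assuming eight compounding years between the end of 2012 and the end of 2020:

```python
# Sanity check of the ABI Research growth figure, assuming eight
# compounding years between end-2012 and end-2020.
start, end, years = 16.6, 260.0, 8            # US$ billions, years
cagr = (end / start) ** (1 / years) - 1
print(f"CAGR: {cagr:.0%}")                    # prints "CAGR: 41%"
```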
The 10 challenges
So what are the challenges that ADAS suppliers face when bringing systems to market? Here, in my opinion, are the top 10:
- Safety must be embedded in the culture of every organization in the supply chain. ADAS suppliers can't treat safety as an afterthought that is tacked on at the end of development; rather, they must embed it into their development practices, processes, and corporate culture. To comply with ISO 26262, an ADAS supplier must establish procedures associated with safety standards, such as design guidelines, coding standards and reviews, and impact analysis procedures. It must also implement processes to assure accountability and traceability for decisions. These processes provide appropriate checks and balances and allow for safety and quality issues to be addressed as early as possible in the development cycle.
- ADAS systems are a collaborative effort. Most ADAS systems must integrate intellectual property from a number of technology partners; they are too complex to be developed in isolation by a single supplier. Also, in a safety-certified ADAS system, every component must be certified — from the underlying hardware (be it a multi-core processor, GPU, FPGA, or DSP) to the OS, middleware, algorithms, and application code. As for the application code, it must be certified to the appropriate automotive safety integrity level; the level for the ADAS applications listed above is typically ASIL D, the highest level of ISO 26262 certification.
- Systems may need to comply with multiple industry guidelines or specifications. Besides ISO 26262, ADAS systems may need to comply with additional criteria, as dictated by the tier one supplier or automaker. On the software side, these criteria may include AUTOSAR or MISRA. On the hardware side, they will include AEC-Q100 qualification, which involves reliability testing of auto-grade ICs at various temperature grades. Depending on the grade a system requires, ICs must function reliably over temperature ranges as wide as -40°C to +150°C.
- ADAS development costs are high. These systems are expensive to build. To achieve economies of scale, they must be targeted at mid- and low-end vehicle segments. Prices will then decline as volume grows and development costs are amortized, enabling more widespread adoption.
- The industry lacks interoperability specifications for radar, laser, and video data in the car network. For audio-video data alone, automakers use multiple data communication standards, including MOST (Media Oriented Systems Transport), Ethernet AVB, and LVDS. As such, systems must support a multitude of interfaces to ensure adoption across this broad spectrum of in-vehicle networks. Systems may also need additional interfaces to carry radar or lidar data.
- The industry lacks standards for embedded vision-processing algorithms. Ask five different developers to build a lane departure warning system and you'll get five different solutions. Each solution will likely start as a MATLAB implementation that is then ported to run on the selected hardware. If the developer is fortunate, the silicon will support image processing primitives (a library of functions designed for use with the hardware) to accelerate development. TI, for instance, has a set of image and video processing libraries (IMGLIB and VLIB) optimized for its silicon. These libraries serve as building blocks for embedded vision processing applications; IMGLIB, for example, has edge detection functions that could be used in a lane departure warning application (a minimal sketch of such a pipeline follows this list).
- Data acquisition and data processing for vision-based systems are high-bandwidth and computationally intensive. Vision-based ADAS systems present their own set of technical challenges. Different systems require different image sensors operating at different resolutions, frame rates, and lighting conditions. A system that performs high-speed, forward-facing driver assistance functions such as road sign detection, lane departure warning, and autonomous emergency braking must support a higher frame rate and resolution than a rear-view camera that performs obstacle detection. (A rear-view camera typically operates at low speeds, and obstacles in its field of view are in close proximity to the vehicle.) Compared to the rear-view camera, an LDW, AEB, or RSD system must acquire and process far more data per second before signaling the driver of an unintentional lane drift or warning the driver that the vehicle is exceeding the posted speed limit (a back-of-the-envelope comparison follows this list).
- ADAS cannot add to driver distraction. As systems become more integrated and present more data to the driver, the growing complexity of in-vehicle tasks and displays can result in driver information overload. That overload produces a high cognitive workload, reducing situational awareness and countering the efficacy of ADAS. Systems must therefore be easy to use, employ the most appropriate modalities (visual, auditory, haptic, etc.), and be designed to encourage driver adoption. Development teams must establish a clear specification of the driver-vehicle interface early in development to ensure user and system requirements are aligned.
- Environmental factors affect ADAS. ADAS systems must function under a variety of weather and lighting conditions. Ideally, vision-based systems should be smart enough to recognize when they are operating in poor visibility, such as heavy fog or snow, or when direct sunlight shines into the lens. If the system detects that the lens is occluded or that lighting conditions are unfavorable, it can disable itself and warn the driver that it is non-operational (a naive version of this self-check is sketched after this list). Another example is an ultrasonic parking sensor that becomes prone to false positives when encrusted with mud. Combining the results of different sensors or different sensor technologies (sensor fusion) can often provide a more effective solution than using a single technology in isolation.
- Testing and validation is an enormous undertaking. Arguably, testing and validation is the most challenging aspect of ADAS development, especially for vision systems. Before deploying a commercial vision system, an ADAS development team must amass hundreds if not thousands of hours of video clips in a regression test database, in an effort to cover every scenario. The ultimate goal is 100% accuracy and zero false positives under all possible conditions: traffic, weather, the number of obstacles or pedestrians in the scene, and so on. But how can the team be sure that the test database comprises all test cases? The reality is that it cannot — which is why suppliers spend years testing and validating systems, and performing extensive real-world field trials in various geographies, prior to commercial deployment. (A minimal regression-harness sketch follows.)
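To make the lane departure warning example above concrete, here is a minimal sketch of the classic pipeline: edge detection followed by a Hough line transform. It uses OpenCV rather than TI's IMGLIB or VLIB, and the region of interest, thresholds, and drift test are illustrative assumptions rather than a production algorithm.

```python
import cv2
import numpy as np

def detect_lane_departure(frame, drift_ratio=0.15):
    """Rough LDW sketch: edge detection + Hough line transform.
    Returns True if the estimated lane centre has drifted from the
    image centre by more than drift_ratio of the image width."""
    h, w = frame.shape[:2]
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)              # edge map
    edges[: h // 2, :] = 0                        # keep lower half, where markings appear
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=40,
                            minLineLength=40, maxLineGap=20)
    if lines is None:
        return False                              # no lane markings detected
    # Estimate the lane centre as the mean x position of detected segments.
    xs = [(x1 + x2) / 2 for x1, y1, x2, y2 in lines[:, 0]]
    lane_centre = float(np.mean(xs))
    return abs(lane_centre - w / 2) > drift_ratio * w

# Usage (hypothetical file name):
# frame = cv2.imread("road_frame.png")
# if detect_lane_departure(frame):
#     print("warning: unintentional lane drift")
```

A real implementation would add camera calibration, perspective correction, and temporal filtering across frames; the point here is only that every team assembles such a pipeline differently in the absence of a standard.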
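The bandwidth point is also easy to quantify. The resolutions and frame rates below are illustrative assumptions, not figures for any particular system, but they show why a forward-facing camera stresses the acquisition and processing pipeline far more than a rear-view camera.

```python
def raw_rate_mb_s(width, height, fps, bytes_per_pixel=2):
    """Uncompressed video data rate in MB/s (e.g. 2 bytes/pixel for YUV422)."""
    return width * height * bytes_per_pixel * fps / 1e6

# Illustrative assumptions: a 1280x800 @ 30 fps forward-facing camera
# versus a 640x480 @ 15 fps rear-view camera.
front = raw_rate_mb_s(1280, 800, 30)   # ~61 MB/s
rear = raw_rate_mb_s(640, 480, 15)     # ~9 MB/s
print(f"front: {front:.0f} MB/s, rear: {rear:.0f} MB/s, ratio: {front / rear:.1f}x")
```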
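A vision system that disables itself in poor visibility needs some way to judge its own input. A naive self-check might look at overall brightness and local contrast; the thresholds here are placeholder assumptions, and a real system would rely on far more robust diagnostics.

```python
import cv2

def camera_usable(frame, min_brightness=30, max_brightness=225,
                  min_contrast=15.0):
    """Crude health check: flag frames that are too dark, washed out by
    direct sunlight, or lacking detail (fog, snow, or a dirty lens).
    Thresholds are placeholder assumptions."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    brightness = gray.mean()
    contrast = cv2.Laplacian(gray, cv2.CV_64F).std()   # proxy for scene detail
    return (min_brightness < brightness < max_brightness
            and contrast > min_contrast)

# If this check fails over several consecutive frames, the ADAS function
# would disable itself and warn the driver that it is non-operational.
```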
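Finally, a regression run over a clip database reduces, mechanically, to bookkeeping: run the detector over every clip, compare its output against hand-labelled ground truth, and tally misses and false positives. The file layout, labels, and detect_events() hook below are all hypothetical; the hard part, as noted above, is amassing clips that cover every combination of traffic, weather, lighting, and geography.

```python
import json
from pathlib import Path

def run_regression(clip_dir, detect_events):
    """Run a detector over every clip and compare against ground truth.
    Assumes (hypothetically) that each <clip>.mp4 has a <clip>.json file
    listing labelled events, e.g. frame indices where a warning is due."""
    misses, false_positives, total_events = 0, 0, 0
    for clip in sorted(Path(clip_dir).glob("*.mp4")):
        truth = set(json.loads(clip.with_suffix(".json").read_text()))
        detected = set(detect_events(clip))        # detector under test
        total_events += len(truth)
        misses += len(truth - detected)
        false_positives += len(detected - truth)
    recall = 1 - misses / total_events if total_events else 1.0
    return {"recall": recall, "false_positives": false_positives}
```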