Kaivan Karimi, SVP of Strategy and Business Development, BlackBerry Technology Solutions (BTS)
Advanced driver assistance systems (ADAS) are among the most important of the many technologies going into the connected autonomous car of the future. ADAS is evolving from discrete, single-function systems, such as blind spot monitoring and lane departure warning, to integrated active safety systems and automated driving. With ADAS, high-performance computing is intersecting with the need for functional safety, changing the very nature of the hardware and software in these next-generation systems. So a flexible, safe, and stable software environment that leverages the performance advances in silicon while maintaining ISO 26262 functional safety certification is critical. In an ADAS-based car, software is the nervous system and the microprocessors/microcontrollers are the brains of the operation. Together they work seamlessly with a range of other hardware components, some of which are noted below.
Radar Systems
Radar technology collects information around the vehicle and feeds it to the ADAS domain controller managing sensor fusion. Several subsystems are part of the package: a 77 GHz radar system that enables high precision and scalability from short- to mid- to long-range detection; 24 GHz radar for high-demand features, such as rear cross traffic alert or blind spot detection; and Light Detection and Ranging (LIDAR) for adaptive cruise control, accident avoidance and mitigation, and object detection. LIDAR is like a light-based radar that sends out short pulses of invisible scanning laser light and, based on how long it takes to see the reflection, calculates how far away each reflecting object is. It then creates a highly accurate 3-D image of the car's surroundings.
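To make the time-of-flight idea concrete, here is a minimal sketch in C of the distance calculation a LIDAR unit performs: the distance is half the pulse's round-trip time multiplied by the speed of light. The function name and the 200 ns example value are illustrative assumptions, not taken from any particular sensor.

```c
#include <stdio.h>

#define SPEED_OF_LIGHT_M_PER_S 299792458.0

/* Convert the measured round-trip time of a laser pulse (in nanoseconds)
 * into the distance to the reflecting object. The pulse travels out and
 * back, so the one-way distance is half the round trip. */
static double lidar_distance_m(double round_trip_ns)
{
    double round_trip_s = round_trip_ns * 1e-9;
    return (SPEED_OF_LIGHT_M_PER_S * round_trip_s) / 2.0;
}

int main(void)
{
    /* A reflection seen 200 ns after the pulse was emitted corresponds
     * to an object roughly 30 meters away. */
    printf("Distance: %.2f m\n", lidar_distance_m(200.0));
    return 0;
}
```

Repeating this calculation for each pulse across the scanning pattern is what builds up the 3-D image of the car's surroundings.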
Vision Processing
A range of cameras and sensors combine to see the world. External cameras assist with lane departure warnings, forward collision warnings, traffic sign recognition, and pedestrian recognition. Internal cameras provide information about the driver's focal point and behavior so that the ADAS system can react accordingly. These can be augmented with 3D capabilities that enable new HMI user experiences, such as gesture recognition and control of cabin functions or infotainment systems. Ultrasound is also used for close-range object detection and will be used in park-assist applications, where a typical car would have between 10 and 12 sensors.
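As a rough illustration of how a park-assist feature might use that ring of ultrasonic sensors, the sketch below scans the latest range reading from each sensor and reports the closest obstacle. The sensor count, readings, and warning threshold are assumptions made for the example, not values from any production system.

```c
#include <stdio.h>

#define NUM_ULTRASONIC_SENSORS 12   /* typical park-assist fit-out        */
#define WARN_DISTANCE_CM       50.0 /* illustrative warning threshold     */

/* Return the index of the sensor reporting the nearest obstacle. */
static int closest_obstacle(const double range_cm[], int count)
{
    int nearest = 0;
    for (int i = 1; i < count; i++) {
        if (range_cm[i] < range_cm[nearest])
            nearest = i;
    }
    return nearest;
}

int main(void)
{
    /* Latest range readings (cm) from the sensors around the bumpers. */
    double ranges[NUM_ULTRASONIC_SENSORS] = {
        210, 180, 95, 60, 42, 130, 250, 300, 220, 175, 88, 66
    };

    int idx = closest_obstacle(ranges, NUM_ULTRASONIC_SENSORS);
    if (ranges[idx] < WARN_DISTANCE_CM)
        printf("Warning: obstacle %.0f cm away at sensor %d\n", ranges[idx], idx);
    else
        printf("Nearest obstacle: %.0f cm (sensor %d)\n", ranges[idx], idx);
    return 0;
}
```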
GPS
The Global Positioning System (GPS) is a satellite-based navigation system using a network of 24 satellites that were put in orbit by the U.S. Department of Defense (DoD) for military applications. In the 1980s the US government made the system available for civilian use. Galileo (EU), GLONASS (Russia), BeiDou (China), and IRNSS (India) are examples of other satellite-based navigation systems being developed around the world. These systems are accurate to within 10 to 50 feet 95% of the time, with most providing a worst-case pseudorange accuracy of 7.8 meters at a 95% confidence level. The actual accuracy depends on factors such as atmospheric effects, line-of-sight clearance to the satellites, and receiver quality. To improve GPS location accuracy to the centimeter level, systems use ground-based reference points in combination with the satellite signals. These types of systems are called "differential GPS," and a familiar example is the navigation offered by rental car companies.
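A minimal sketch of the differential idea follows, under heavy simplifications: it works on positions in a flat local frame with a single reference station, whereas real differential GPS corrects per-satellite pseudoranges. All names and numbers here are illustrative assumptions.

```c
#include <stdio.h>

/* Position in a local east/north frame, in meters. */
typedef struct {
    double east_m;
    double north_m;
} position_t;

/* Reference station: compare the GPS-reported fix with the precisely
 * surveyed position to produce a correction it can broadcast. */
static position_t compute_correction(position_t surveyed, position_t gps_fix)
{
    position_t corr = { surveyed.east_m - gps_fix.east_m,
                        surveyed.north_m - gps_fix.north_m };
    return corr;
}

/* Nearby receiver: apply the broadcast correction to its own GPS fix. */
static position_t apply_correction(position_t gps_fix, position_t corr)
{
    position_t out = { gps_fix.east_m + corr.east_m,
                       gps_fix.north_m + corr.north_m };
    return out;
}

int main(void)
{
    position_t surveyed    = { 1000.0, 2000.0 };  /* known station position  */
    position_t station_fix = { 1003.1, 1997.6 };  /* its (erroneous) GPS fix */
    position_t rover_fix   = { 1500.4, 2501.9 };  /* nearby car's GPS fix    */

    position_t corr      = compute_correction(surveyed, station_fix);
    position_t corrected = apply_correction(rover_fix, corr);

    printf("Corrected position: %.1f E, %.1f N\n",
           corrected.east_m, corrected.north_m);
    return 0;
}
```

Because the station and the car see largely the same atmospheric and satellite errors, subtracting the station's measured error removes most of the car's error as well.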
For a self-driving car you need to know which lane the car is in and where it is within that lane in relation to other cars and the structures surrounding it, and all of this must be updated at high rates in real time. This requires significant computational power as well as GPS augmented with accelerometers, altimeters, gyroscopes, and a tachometer/odometer to achieve finer measurements of the car's position under various conditions.
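As a sketch of how GPS might be blended with dead-reckoning sensors such as the odometer and gyroscope, the example below uses a simple complementary filter; a production system would use something closer to a Kalman filter, and the gain, sensor values, and function names here are assumptions for illustration only.

```c
#include <stdio.h>
#include <math.h>

/* Weight given to the absolute GPS fix; the rest stays on the smooth
 * but slowly drifting dead-reckoning estimate. */
#define ALPHA 0.1

typedef struct {
    double x_m;         /* east position   */
    double y_m;         /* north position  */
    double heading_rad; /* current heading */
} state_t;

/* Propagate the estimate using wheel odometry and gyro yaw rate. */
static void dead_reckon(state_t *s, double distance_m,
                        double yaw_rate_rad_s, double dt_s)
{
    s->heading_rad += yaw_rate_rad_s * dt_s;
    s->x_m += distance_m * cos(s->heading_rad);
    s->y_m += distance_m * sin(s->heading_rad);
}

/* Nudge the estimate toward the latest GPS fix when one arrives. */
static void fuse_gps(state_t *s, double gps_x_m, double gps_y_m)
{
    s->x_m = (1.0 - ALPHA) * s->x_m + ALPHA * gps_x_m;
    s->y_m = (1.0 - ALPHA) * s->y_m + ALPHA * gps_y_m;
}

int main(void)
{
    state_t car = { 0.0, 0.0, 0.0 };

    dead_reckon(&car, 1.5, 0.02, 0.1);  /* 100 ms of driving */
    fuse_gps(&car, 1.6, 0.1);           /* GPS fix arrives   */

    printf("Estimated position: %.2f E, %.2f N\n", car.x_m, car.y_m);
    return 0;
}
```

The inertial sensors keep the lane-level estimate smooth between GPS fixes (and through tunnels or urban canyons), while the GPS fixes keep the dead-reckoning drift in check.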
V2X
V2X communication refers to the exchange of information from a vehicle to anything that may affect the vehicle, and vice versa. V2X stands for Vehicle-to-Vehicle (V2V), Vehicle-to-Infrastructure (V2I), Vehicle-to-Pedestrian (V2P), Vehicle-to-Device (V2D), Vehicle-to-Grid (V2G) and, for all practical purposes, Vehicle-to-Everything. (You can see that the "X" is the catch-all variable.)
V2X is considered a cooperative approach between cars and their environment, providing a more effective means of avoiding accidents and traffic congestion. For V2X to really work it needs to be rolled out with adoption rates of greater than 95%, and from that perspective it may be a few years before the infrastructure is put in place. The communication technology most often discussed for V2X is Dedicated Short Range Communication (DSRC), operating at 5.9 GHz and based on 802.11p Wireless Access in Vehicular Environments (WAVE). The architecture, message protocols, and security standards are based on IEEE 1609.x in the US and the corresponding ETSI standards in Europe. Note that cryptographic security must be built in so that the signals sent and received can be trusted; false or corrupted signals can produce dire results.
V2X will establish a hybrid access network and enable the flow of information about traffic delays and hazard warnings (e.g., road flooding, downed electrical poles, or even cars driving in the wrong direction) in real time.
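The signing requirement mentioned above can be sketched as follows. The message layout and the sign/verify routines are placeholders; a real DSRC stack exchanges standardized Basic Safety Messages wrapped in an IEEE 1609.2 security envelope with certificate-based signatures, so treat this only as an illustration of the pattern: unsigned or corrupted messages are never acted upon.

```c
#include <stdint.h>
#include <string.h>
#include <stdio.h>

/* Illustrative stand-in for a V2V safety message (not the real BSM layout). */
typedef struct {
    uint32_t vehicle_id;
    int32_t  latitude_e7;     /* degrees * 1e7 */
    int32_t  longitude_e7;
    uint16_t speed_cm_s;
    uint16_t heading_centideg;
    uint8_t  signature[64];   /* would hold a real signature in practice */
} safety_msg_t;

/* Placeholder signing: a real stack would use the sender's certificate
 * and private key. Here we just stamp a fixed pattern so the verify
 * step has something to check. */
static void sign_message(safety_msg_t *msg)
{
    memset(msg->signature, 0xA5, sizeof(msg->signature));
}

/* Placeholder verification: reject anything whose signature does not
 * check out, so false or corrupted messages are dropped, not trusted. */
static int verify_message(const safety_msg_t *msg)
{
    for (size_t i = 0; i < sizeof(msg->signature); i++) {
        if (msg->signature[i] != 0xA5)
            return 0;
    }
    return 1;
}

int main(void)
{
    safety_msg_t msg = { 42, 451234567, -751234567, 1500, 9000, {0} };

    sign_message(&msg);
    printf("Message %s\n", verify_message(&msg) ? "trusted" : "rejected");
    return 0;
}
```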
Telematics
A telematics system combines the functions of telecommunications and informatics for a car, and a good way to explain the range of functionality in a telematics system is to take a closer look at what is supported by OnStar from General Motors. OnStar includes a cellular modem, GPS, connections to a variety of sensors (some of which are dedicated to reporting significant crashes), a backup battery, and a roof-mounted antenna with a range that is better than a typical cellphone.
The box itself gets a "black-box" treatment and is mounted in the back of the car to shield it from most crashes. The system is connected to a call center, which in turn can report accidents to a public safety answering point, such as a 911 operator, and contact garages if only simple towing services or mechanical help is needed. After any incident, the call center operator contacts the passengers of the car to gather more information and assure them that help is on the way. Emergency and roadside assistance, along with basic vehicle diagnostics, are the most popular services for most telematics systems.
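To make that data flow concrete, here is a rough sketch of the kind of record a telematics unit might assemble and push to the call center when a crash sensor fires; every field and function name is an assumption for illustration, not OnStar's actual format.

```c
#include <stdint.h>
#include <stdio.h>
#include <time.h>

/* Illustrative crash notification record (not any vendor's real format). */
typedef struct {
    uint32_t vehicle_id;
    time_t   timestamp;
    double   latitude_deg;      /* from the unit's GPS      */
    double   longitude_deg;
    uint8_t  airbag_deployed;   /* from the crash sensors   */
    uint8_t  rollover_detected;
} crash_event_t;

/* In a real unit this would go out over the cellular modem to the
 * call center; here we simply print it. */
static void report_to_call_center(const crash_event_t *ev)
{
    printf("CRASH vehicle=%u lat=%.5f lon=%.5f airbag=%u rollover=%u\n",
           (unsigned)ev->vehicle_id, ev->latitude_deg, ev->longitude_deg,
           ev->airbag_deployed, ev->rollover_detected);
}

int main(void)
{
    crash_event_t ev = { 1001, time(NULL), 42.33167, -83.04792, 1, 0 };
    report_to_call_center(&ev);
    return 0;
}
```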
Over time, a host of other services have been added, from weather reports and sports scores to traffic information, geo-fencing, and stolen vehicle tracking. The list of automakers who already offer telematics services includes GM, Chrysler, Ford, Lincoln, Audi, BMW, Mercedes-Benz, Volkswagen, Porsche, Jaguar, Rolls-Royce, Volvo, Mini, Toyota, Infiniti, Lexus, Mazda, Nissan, and Subaru.
Domain Controllers and Microprocessor/Microcontroller Units (MPUs/MCUs)
MCUs and MPUs are the physical hardware brains of the whole vehicle operation, and combined with powerful sensor fusion algorithms they are what turn a car into a robot. With the number of sensors feeding situational awareness data in real time, one can see that high-speed, high-bandwidth data processing is at the heart of automated driving. High-performance Electronic Control Units (ECUs) accept the sensor inputs that monitor the automobile's constantly changing environment and fuse those data at speeds greater than 1 Gb/s to make safe decisions. This will ultimately shift the burden of "situational awareness and response" from the driver to the car. Making real-time decisions on the amount of data discussed above requires secure, reliable, and very fast processing computers.
The growth of electronics in cars has resulted in double-digit growth in the number of ECUs used across all car segments. Today's embedded vehicle functions are distributed across up to 100 ECUs connected over several buses, and typically use 6-8 operating systems. This decentralized approach has drawbacks: it increases the complexity, weight, and overall cost of the vehicle. The trend now is to move from 80-100 decentralized ECUs scattered across the vehicle to 8 to 12 domains with their respective mega-ECUs, or domain controllers, which among many other things reduces the complexity of the system.
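A highly simplified sketch of what one of those domain controllers does follows: it pulls the latest inputs from several sensors that previously fed separate ECUs and fuses them into a single decision at a high update rate. The sensor types, thresholds, and decision logic are illustrative assumptions only.

```c
#include <stdio.h>

/* Latest inputs gathered from the sensors feeding one ADAS domain. */
typedef struct {
    double radar_range_m;        /* distance to the car ahead         */
    double radar_closing_m_s;    /* closing speed toward that car     */
    double camera_lane_offset_m; /* lateral offset within the lane    */
} domain_inputs_t;

typedef enum { ACTION_NONE, ACTION_WARN, ACTION_BRAKE } action_t;

/* One pass of a (toy) fusion/decision step the domain controller would
 * run at a high rate: combine radar and camera data into one action. */
static action_t decide(const domain_inputs_t *in)
{
    double time_to_collision_s =
        (in->radar_closing_m_s > 0.1)
            ? in->radar_range_m / in->radar_closing_m_s
            : 1e9; /* not closing on anything */

    if (time_to_collision_s < 1.5)
        return ACTION_BRAKE;  /* imminent: act on the driver's behalf */
    if (time_to_collision_s < 3.0 || in->camera_lane_offset_m > 0.8)
        return ACTION_WARN;   /* alert the driver                     */
    return ACTION_NONE;
}

int main(void)
{
    domain_inputs_t now = { 18.0, 9.0, 0.2 };  /* ~2 s to collision */
    printf("Action: %d\n", decide(&now));
    return 0;
}
```

Consolidating this kind of logic into a handful of domain controllers, rather than spreading it across dozens of single-function ECUs, is what reduces the wiring, weight, and integration complexity described above.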
The next blog will address the software architectural issues to be considered when creating the connected autonomous car of the future. For more, see the QNX website.
_______________________________________________________________________________
Kaivan Karimi is the SVP of Strategy and Business Development at BlackBerry Technology Solutions (BTS). His responsibilities include operationalizing growth strategies, product marketing and business development, ecosystem enablement, and execution of business priorities. He has been an IoT evangelist since 2010, bringing more than two decades of experience in the cellular, connectivity, networking, sensor, and microcontroller semiconductor markets. Kaivan holds graduate degrees in engineering (MSEE) and business (MBA). Prior to joining BlackBerry, he was the VP and General Manager of Atmel's wireless MCU and IoT business unit.