🧐 The 3 main sensors that let the car "see"
Self-driving cars need an accurate "perception" of their surroundings before they can process information and make decisions. This perception system relies on three main sensors working together to provide comprehensive, reliable data in all situations.
| Sensor | Working principle | Strengths | Limitations |
| --- | --- | --- | --- |
| LiDAR | Light Detection and Ranging: emits pulsed laser light and measures the time it takes for the light to return (time of flight). | Creates highly accurate 3D maps of the environment, capturing both shape and distance, even at long range. | Expensive; performance degrades in bad weather (e.g. fog, heavy rain); high-power laser pulses can interfere with other optical sensors. |
| Radar | Radio Detection and Ranging: emits radio waves and receives the reflected signals. | Excellent at detecting the distance and relative speed of objects; robust to weather conditions (rain, snow, fog). | Lower resolution than LiDAR; cannot resolve the shape of objects in detail. |
| Camera | Optical cameras record images and video. | Computer vision recognizes objects (cars, pedestrians), reads traffic signs, and detects traffic lights and lane markings, providing color and texture information. | Performance depends on lighting (struggles in very dark or very bright scenes); cannot measure distance as accurately as LiDAR or radar. |
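The time-of-flight principle behind LiDAR (and radar) reduces to a one-line formula: distance = speed of light × round-trip time / 2. A minimal sketch, with a hypothetical round-trip time chosen for illustration:

```python
# Time-of-flight ranging sketch (illustrative values, not real sensor data)
SPEED_OF_LIGHT = 299_792_458  # metres per second

def tof_distance(round_trip_s: float) -> float:
    """Distance to a target from the round trip of a pulsed signal.

    The pulse travels to the target and back, so divide by 2.
    """
    return SPEED_OF_LIGHT * round_trip_s / 2

# A pulse returning after ~333.6 nanoseconds puts the target about 50 m away
print(round(tof_distance(333.6e-9), 1))
```

The same arithmetic applies to radar; only the wavelength and hardware differ.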
🎯 Sensor Fusion
In autonomous driving systems, data from all three sensor types is combined and processed through a technique called "sensor fusion." This step is critical because it:

- Compensates for weaknesses: when one sensor performs poorly (a camera at night, or LiDAR in bad weather), another sensor that still works well, such as radar, can confirm the information.
- Increases accuracy: combining precise range data (LiDAR/radar) with object-classification data (camera) lets the system pinpoint locations and understand the environment more completely.
- Increases safety: the system always has redundant sources of data, enabling safe and reliable driving decisions in all situations.
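The compensation idea above can be sketched as a toy fusion rule: take the range estimate from the most confident ranging sensor and the object label from the most confident classifying sensor, while dropping any degraded input. This is a simplified illustration, not a real fusion algorithm (production systems use probabilistic methods such as Kalman filters); all names and confidence values here are hypothetical.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Detection:
    sensor: str                  # "lidar", "radar", or "camera"
    distance_m: Optional[float]  # None if the sensor cannot measure range
    label: Optional[str]         # None if the sensor cannot classify
    confidence: float            # 0..1; degraded sensors report low values

def fuse(detections: list[Detection], min_conf: float = 0.3) -> dict:
    """Naive fusion: best-confidence range + best-confidence label,
    ignoring any sensor whose confidence falls below min_conf."""
    usable = [d for d in detections if d.confidence >= min_conf]
    ranging = [d for d in usable if d.distance_m is not None]
    labeling = [d for d in usable if d.label is not None]
    best_range = max(ranging, key=lambda d: d.confidence, default=None)
    best_label = max(labeling, key=lambda d: d.confidence, default=None)
    return {
        "distance_m": best_range.distance_m if best_range else None,
        "label": best_label.label if best_label else None,
    }

# Night scene: the camera is degraded, but radar still ranges reliably
obs = [
    Detection("camera", None, "pedestrian", 0.2),  # too dark, low confidence
    Detection("radar", 41.8, None, 0.9),
    Detection("lidar", 42.1, None, 0.6),
]
print(fuse(obs))  # radar range wins; the degraded camera label is dropped
```

The design point is redundancy: even with the camera below threshold, the system still produces a usable range estimate from the remaining sensors.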
In short, these sensors act as the car's eyes and ears. LiDAR creates accurate 3D maps, radar detects speed and distance even in bad weather, and cameras understand details of the outside world like colors and road signs.
| Keyword group | Keywords |
| --- | --- |
| Core technology | Automotive sensors, self-driving car technology, ADAS, autonomous vehicle, sensor fusion |
| Sensor name | LiDAR, radar, camera, LiDAR sensor, radar sensor, car camera |
| Function | Working principle, environmental perception, distance measurement, 3D map creation, object classification |
| Industry/target group | Electric vehicles, EVs, smart vehicles, automotive innovation, safety technology |
Figure 1: Overview of the sensors working together.
This image illustrates how all three sensors on a self-driving car work together, with lines showing each sensor's detections.
Text in image: SENSOR FUSION, PERCEPTION & DECISION
Figure 2: LiDAR - Creating accurate 3D maps.
This image focuses on LiDAR, showing a car with a LiDAR on its roof and a laser beam scanning its surroundings to create a 3D point cloud.
Text in image: LiDAR: PRECISION 3D MAPPING, PULSED LASER BEAMS, HIGH-DEFINITION POINT CLOUD
Figure 3: Radar - Distance and speed detection in all weather conditions.
This image shows radar emitting radio waves and receiving their reflections, demonstrating its ability to penetrate bad weather such as rain or fog.