Automotive
While self-driving vehicles are gradually becoming a reality, more and more of today’s cars roll out of factories with advanced driver assistance systems (ADAS). We are quickly getting used to adaptive cruise control, blind-spot monitoring, parking assistance, lane departure warning, and many other features that make driving safer and the driver’s job easier. Data from cameras, sensors, and V2X infrastructure feed into these systems, increasing their accuracy and efficiency. They are important steps toward fully autonomous driving, but the ultimate responsibility for decision making still lies with the driver.
The more connected cars become, the more the average driver can be bombarded with information while driving. “In 500 feet, make a right turn.” “You have an incoming call from Christine.” “You have a new message on Facebook.” “You are over the speed limit.” This may not be much of a distraction under normal conditions. But sometimes, when driving in hectic city traffic or in a snowstorm, it is critical to keep your eyes on the road while still receiving essential information. The good news is that the technology to remedy this already exists.
Heads up for HUDs
Keeping the driver’s eyes on the road is a priority, and head-up displays (HUDs) can accomplish just that: they project alerts and navigation prompts right onto the windshield. Analysts predict explosive growth for HUDs, with the market reaching close to US$100 billion by 2020. Most of today’s HUDs are relatively simple combiner units, but wide-field-of-view HUDs are coming soon.
Projecting alerts and navigation prompts directly on the windshield.
Computer vision, also known as machine vision, is key to processing this endless flow of data. With its human-like image recognition ability, computer vision interprets the road scene while the system fuses data from multiple sources. Add a natural representation of the results in the form of augmented reality, aligned with the driver’s view through pupil tracking, and you have a completely new level of driving experience: safe and intuitive.
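To give a flavor of why pupil tracking matters for an augmented reality overlay, here is a minimal geometric sketch, not Luxoft’s actual implementation: a detected object and the driver’s eye position are combined to find where a marker should be drawn on the glass so that it lines up with the object from the driver’s viewpoint. All names, coordinate conventions, and the flat-windshield assumption are illustrative.

#include <cstdio>

struct Vec3 { double x, y, z; };   // vehicle frame: x forward, y left, z up (metres)

// Intersect the ray from the driver's eye through the object with a vertical
// windshield plane located planeX metres ahead of the vehicle origin.
// Writes the (y, z) coordinates of the overlay point on that plane.
static bool overlayPoint(const Vec3& eye, const Vec3& object, double planeX,
                         double& outY, double& outZ) {
    double dx = object.x - eye.x;
    if (dx <= 0.0) return false;               // object must be in front of the eye
    double t = (planeX - eye.x) / dx;          // ray parameter at the windshield
    outY = eye.y + t * (object.y - eye.y);
    outZ = eye.z + t * (object.z - eye.z);
    return true;
}

int main() {
    Vec3 eye    {0.0, -0.35, 1.20};            // eye position from pupil tracking
    Vec3 walker {25.0, 2.0, 1.00};             // pedestrian detected 25 m ahead
    double y, z;
    if (overlayPoint(eye, walker, /*planeX=*/0.8, y, z))
        std::printf("Draw pedestrian marker at y=%.3f m, z=%.3f m on the glass\n", y, z);
    return 0;
}

If the eye position is assumed fixed instead of tracked, the marker drifts away from the real object whenever the driver leans or turns, which is exactly the misalignment pupil tracking is meant to avoid.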
Next-generation driving experience
At Luxoft, we’ve been working on making this experience a reality. The result is CVNAR, a computer vision and augmented reality solution. CVNAR is a powerful software framework whose mathematical algorithms process vast amounts of road data in real time to generate intuitive prompts and alerts. It has built-in algorithms for road and pedestrian detection, vehicle recognition and tracking, lane detection, facade recognition and texture extraction, road sign recognition, and parking space search. It performs relative and absolute positioning and integrates easily with navigation, map databases, sensors, and other data sources. A unique feature of CVNAR is its extrapolation engine for latency avoidance.
Detecting and recognizing road signs, pedestrians, traffic lanes, gas stations, and other objects.
Alerting the driver to an empty parking spot.
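The extrapolation idea mentioned above can be illustrated with a small sketch. This is not CVNAR’s actual engine; it simply shows the principle under an assumed constant-velocity model: a tracked object’s last known state is projected forward to the moment the frame will reach the display, so the overlay does not lag behind the real scene. All type and variable names are hypothetical.

#include <cstdio>

struct TrackedObject {
    double x, y;        // position in the vehicle frame (metres)
    double vx, vy;      // estimated velocity relative to the vehicle (m/s)
    double timestamp;   // time of the last measurement (seconds)
};

// Predict where the object will be at displayTime, assuming constant velocity
// over the short camera-to-display latency (typically tens of milliseconds).
static TrackedObject extrapolate(const TrackedObject& obj, double displayTime) {
    double dt = displayTime - obj.timestamp;
    return { obj.x + obj.vx * dt, obj.y + obj.vy * dt, obj.vx, obj.vy, displayTime };
}

int main() {
    // Pedestrian measured 20 m ahead, closing at 12 m/s relative to the vehicle,
    // with an assumed end-to-end pipeline latency of 80 ms.
    TrackedObject ped {20.0, 1.5, -12.0, 0.0, 0.000};
    TrackedObject onScreen = extrapolate(ped, /*displayTime=*/0.080);
    std::printf("Render marker at x=%.2f m, y=%.2f m instead of x=%.2f m\n",
                onScreen.x, onScreen.y, ped.x);
    return 0;
}

At highway speeds, even a few tens of milliseconds of processing delay corresponds to a metre or more of travel, which is why some form of prediction is needed to keep augmented reality markers pinned to the objects they annotate.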
The era of the software-defined car
A modern car runs on code as much as it runs on gasoline (or a battery-powered electric motor). Today it takes over 100 million lines of software code to get a premium car going, and the amount of software required keeps expanding. At Luxoft, we are excited about the car’s digital future, and we work every day to help bring it about by developing cutting-edge automotive solutions for leading global vehicle manufacturers.
Offering a wide range of embedded software development and integration services for in-vehicle infotainment and telematics systems, digital instrument clusters, and head-up displays, Luxoft has developed User Experience (UX) and Human Machine Interface (HMI) technology for millions of vehicles on the road today. We push the technological envelope in areas such as situation-aware HMI, computer vision, and augmented reality. Meanwhile, Luxoft’s Populus and Teora UX and HMI design tool chains power the development of award-winning automotive HMIs and slash time to market.
Software holds the key to the future of cars. It is essential to creating a customized user experience in vehicles. With over-the-air updates, it offers unmatched flexibility and scalability. Finally, it takes safety to the next level through complex algorithms that approximate human-like judgment.
You can view Luxoft’s CVNAR solution running on a QNX-based ADAS demo this week at CES, in the BlackBerry booth: LVCC North Hall, #325.
About Alex
Alex Leonov has worked in the automotive and IT industries for over 18 years in various business development and marketing roles. Currently, he leads the global marketing efforts of Luxoft Automotive.