
Autonomous cars that can navigate winter roads? ‘Snow problem!

A look at what happens when you equip a Ford Fusion with sensor fusion.

Paul Leroux
Let's face it, cars and snow don't mix. A heavy snowfall can tax the abilities of even the best driver — not to mention the best automated driving algorithm. As I discussed a few months ago, snow can mask lane markers, obscure street signs, and block light-detection sensors, making it difficult for an autonomous car to determine where it should go and what it should do. Snow can even trick the car into "seeing" phantom objects.

Automakers, of course, are working on the problem. Case in point: Ford’s autonomous research vehicles. These experimental Ford Fusion sedans create 3D maps of roads and surrounding infrastructure when the weather is good and visibility clear. They then use the maps to position themselves when the road subsequently disappears under a blanket of the white stuff.

How accurate are the maps? According to Ford, the vehicles can position themselves to within a centimeter of their actual location. Compare that to GPS, which is accurate to about 10 yards (9 meters).

To create the maps, the cars use LiDAR scanners. These devices collect a ginormous volume of data about the road and surrounding landmarks, including signs, buildings, and trees. Did I say ginormous? Sorry, I meant gimongous: 600 gigabytes per hour. The scanners generate so many laser points — 2.8 million per second — that some can bounce off falling snowflakes or raindrops, creating the false impression that an object is in the way. To eliminate these false positives, Ford worked with University of Michigan researchers to create an algorithm that filters out snow and rain.
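Ford and the University of Michigan haven't published the details of their filter, but a common approach to the same problem exploits the fact that snowflakes show up as isolated returns, while real obstacles produce dense clusters of laser points. The sketch below illustrates that idea in Python; the function name, thresholds, and brute-force neighbor search are illustrative assumptions, not the actual algorithm:

```python
import numpy as np

def filter_sparse_returns(points, radius=0.5, min_neighbors=3):
    """Drop LiDAR returns that have too few nearby points.

    Solid objects (signs, walls, cars) yield dense clusters of
    returns; snowflakes and raindrops appear as isolated points.
    Hypothetical sketch only -- not Ford's actual algorithm.

    points: (N, 3) array of x, y, z coordinates in meters.
    """
    points = np.asarray(points, dtype=float)
    # Pairwise distances (fine for a sketch; a real point cloud
    # would need a k-d tree to stay fast at millions of points).
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    neighbor_counts = (d < radius).sum(axis=1) - 1  # exclude self
    return points[neighbor_counts >= min_neighbors]
```

A production system would use a spatial index to handle millions of points per second, and could also weigh return intensity, since snowflakes tend to reflect less light than solid surfaces.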

The cars don’t rely solely on LiDAR. They also use cameras and radar, and blend the data from all three sensor types in a process known as sensor fusion. This “fused” approach compensates for the shortcomings of any particular sensor technology, allowing the car to interpret its environment with greater certainty. (To learn more about sensor fusion for autonomous cars, check out this recent EE Times Automotive article from Hannes Estl of TI.)
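To see why fusing sensors reduces uncertainty, consider a toy example: several sensors independently estimate the same quantity, and each is weighted by how much we trust it. Real fusion stacks use Kalman filters and far richer models; this inverse-variance sketch (the function and all numbers are mine, not from the article) just shows the principle that the fused estimate is more certain than any single sensor:

```python
def fuse_estimates(estimates):
    """Combine independent sensor readings of the same quantity
    (e.g., distance to the car ahead) by inverse-variance
    weighting: trust each sensor in proportion to its certainty.

    A toy illustration of the principle, not a production fusion
    stack (those typically track estimates over time with Kalman
    filters).

    estimates: list of (value, variance) pairs, one per sensor.
    """
    weights = [1.0 / var for _, var in estimates]
    fused = sum(w * v for (v, _), w in zip(estimates, weights)) / sum(weights)
    fused_variance = 1.0 / sum(weights)
    return fused, fused_variance
```

For a radar reading of 25.0 m (variance 0.04) and a snow-degraded camera reading of 24.0 m (variance 1.0), the fused estimate lands near the radar's value, and its variance comes out below 0.04 — better than either sensor alone.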

Ford claims to be the first automaker to demonstrate robot cars driving in the snow. But it certainly won’t be the last. To gain worldwide acceptance, robot cars will have to prove themselves on winter roads, so we are sure to see more innovation on this (cold) front. ;-)

In the meantime, dim the lights and watch this short video of Ford’s “snowtonomy” technology:



Did you know? In January, QNX announced a new software platform for ADAS and automated driving systems, including sensor fusion solutions that combine data from multiple sources such as cameras and radar processors. Learn more about the platform here and here.

“I don’t know where I’m going from here, but I promise it won’t be boring”

Patryk Fournier
The quote comes from the late, great David Bowie, and it feels prophetic when applied to autonomous driving, which is still very much uncharted territory. Investments in roadway infrastructure are being made, consumer acceptance is trending positive, and, judging by the news and excitement from CES 2016, the future, if nothing else, will not be boring.

CES 2016 stretched into the weekend this year and ICYMI there was a lot of compelling media coverage of QNX and BlackBerry. Here’s a roundup of the most interesting coverage from the weekend:

Ars Technica: QNX demos new acoustic and ADAS technologies
The crew from Ars Technica filmed a terrific demonstration of the QNX Acoustics Management Platform and the QNX Platform for ADAS. The demonstration highlights the power and versatility of the acoustics platform, including the QNX In-Car Communication module, which allows the driver to effortlessly speak to passengers in the back of the vehicle, over the roar of an engine revving at high speed. The demonstration also showcases how the QNX OS can support augmented reality and heads-up displays:

Huffington Post: CES 2016 Proves The Future Of Driverless Cars Is Promising
Huffington Post highlighted BlackBerry and QNX as key newsmakers for advancements in driverless cars. The article notes QNX’s automotive leadership: “The software is actually installed in 50 per cent of the world’s automotive infotainment systems including Audi, Volkswagen, Ford, GM and Chrysler.”

Crackberry: Inside the QNX Toyota Highlander at CES 2016
The folks at CrackBerry filmed a demonstration of our latest technology concept vehicle, based on a Toyota Highlander. The demo focuses on the QNX In-Car Communication acoustics module, which forms part of the recently launched QNX Acoustics Management Platform:



HERE 360: QNX and HERE bring to life a multi-screen experience in vehicles
A blog post from our ecosystem partner mentions HERE navigation and its use in the Toyota Highlander and Jeep Wrangler technology concept vehicles.

Why is software the key to bringing augmented reality to cars?

Guest post by Alex Leonov, marketing director, Luxoft Automotive.

While self-driving vehicles are gradually becoming a reality, more and more of today's cars roll out from factories featuring advanced driver assistance systems (ADAS). We are quickly getting used to adaptive cruise control, blind spot monitoring, parking assistance, lane departure warning, and many other features that make driving safer and the driver's job easier. Data from cameras, sensors, and V2X infrastructure feed into ADAS systems, increasing their accuracy and efficiency. These systems are important steps toward fully autonomous driving, but the ultimate responsibility for decision making still lies with the driver.

The more that cars become connected, the more the average driver can be bombarded by information while driving. “In 500 feet make a right turn.” “You have an incoming call from Christine.” “You have a new message on Facebook.” “You are over the speed limit.” This may not be so big of a distraction under normal conditions. But sometimes, when driving in hectic city traffic or in a snow storm, it is critical to keep eyes on the road, while still receiving essential information. The good news is, the technology is already there to remedy this.

Heads up for HUDs
Keeping the driver's eyes on the road is a priority, and head-up displays (HUDs) can accomplish just that. They project alerts and navigation prompts right on the windshield. Analysts predict explosive growth for HUDs, with the market reaching close to US$100 billion by 2020. The bulk of today's HUDs are relatively simple combiners, but more advanced wide-field-of-view HUDs are coming soon.

Projecting alerts and navigation prompts directly on the windshield.
HUDs are perfect for presenting information in a convenient, natural way, and for giving the driver a feeling of being in control. But HUDs are only as good as the information they display. That is why it is critical to have solid, reliable data processing and decision-making algorithms, running on a reliable OS, that can prioritize and filter data. The resulting alerts and prompts must be communicated to the driver in a clear, transparent way.
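As an illustration of the kind of prioritization described above, here is a hypothetical sketch (the priority levels and function are mine, not from any actual HUD product) that always passes safety-critical alerts through to the display but suppresses convenience notifications when driver workload is high:

```python
# Hypothetical priority tiers for HUD alerts: safety-critical
# warnings always reach the driver; navigation prompts usually
# do; social/convenience notifications are dropped when the
# driving situation demands full attention.
CRITICAL, NAVIGATION, CONVENIENCE = 0, 1, 2

def select_hud_alerts(alerts, high_workload):
    """alerts: list of (priority, message) pairs.
    Returns the messages to display, most urgent first."""
    if high_workload:
        allowed = {CRITICAL, NAVIGATION}
    else:
        allowed = {CRITICAL, NAVIGATION, CONVENIENCE}
    return [msg for prio, msg in sorted(alerts) if prio in allowed]
```

In heavy traffic or a snow storm, "You are over the speed limit" still appears, while "You have a new message on Facebook" waits until conditions improve.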

Computer vision, also known as machine vision, is key to processing this endless flow of data. With its human-like image recognition ability, computer vision processes road scenes, and the system fuses data from multiple sources. Add a natural representation of the processing outcomes in the form of augmented reality, while tracking the driver's pupils, and you have a completely new level of driver experience: safe and intuitive.

Next-generation driving experience
At Luxoft, we’ve been working on making this experience a reality. The result is CVNAR, a computer vision and augmented reality solution. CVNAR is a powerful software framework containing mathematical algorithms that process a vast amount of road data in real time to generate intuitive prompts and alerts. CVNAR has built-in algorithms for road and pedestrian detection, vehicle recognition and tracking, lane detection, facade recognition and texture extraction, road sign recognition, and parking space search. It performs relative and absolute positioning and easily integrates with navigation, the map database, sensors, and other data sources. A unique feature of CVNAR is its extrapolation engine for latency avoidance.

Detecting and recognizing road signs, pedestrians, traffic lanes, gas stations, and other objects.
CVNAR works perfectly with LCD displays and smartglasses, but it is ultimately built for HUDs. Data from cameras, sensors, CAN, and navigation maps are fused and processed to create an extendable metadata output that describes all augmented objects. It takes a HUD and an eye-tracking camera to implement CVNAR in a vehicle. CVNAR will track the driver’s gaze and adjust the position of the augmented objects in the driver’s line of sight to make sure they don’t obstruct anything important — all in real time.

Alerting the driver to an empty parking spot.
This is not all that CVNAR can do. New car models come packed with infotainment features that take time to learn and memorize. The CVNAR-based smartphone app can help. It turns your smartphone into an interactive guide. Point your phone camera at your dashboard and use augmented prompts to find out more about a particular car function. It can work under the hood, too.

Era of a software-defined car
A modern car runs on code as much as it runs on gasoline (or a battery-powered electric motor). Today, it takes over 100 million lines of software code to get a premium car going, and the amount of software necessary keeps expanding. At Luxoft, we are excited about the car’s digital future, and we work every day to help bring it about, by developing cutting-edge automotive solutions for leading global vehicle manufacturers.

Luxoft offers a wide range of embedded software development and integration services for in-vehicle infotainment and telematics systems, digital instrument clusters, and head-up displays, and has developed user experience (UX) and human-machine interface (HMI) technology for millions of vehicles on the road today. We push the envelope in areas such as situation-aware HMI, computer vision, and augmented reality, while Luxoft's products, the Populus and Teora UX and HMI design tool chains, power the development of award-winning automotive HMIs and slash time to market.

Software holds the key to the future of cars. It is essential to creating a customized user experience in vehicles. With over-the-air updates, software offers unmatched flexibility and scalability. Finally, it takes safety to the next level with its ability to simulate human-like logic through complex algorithms.

You can view Luxoft’s CVNAR solution running on a QNX-based ADAS demo this week at CES, in the BlackBerry booth: LVCC North Hall, #325.



About Alex
Alex Leonov has been in the automotive and IT industry for over 18 years in various business development and marketing roles. Currently, Alex leads the global marketing efforts of Luxoft Automotive.

Video: Paving the way to an autonomous future

Lynn Gayowski
CES 2016 is now underway, and our kickoff to the year wouldn’t be complete without a behind-the-scenes look at the making of our new technology concept vehicle and updated reference vehicle.

The video below follows the journey of building our vehicles for CES 2016 and highlights the technologies we’re using to speed progress towards automated driving — and the list of tech that QNX covers is impressive! It includes advanced driver assistance systems (ADAS), V2X, and augmented reality, not to mention digital instrument clusters, in-car communication, and infotainment:



QNX Software Systems continues to innovate in automotive, with a vision for the evolution of automated driving and a trusted foundation for building reliable, adaptable systems. At the risk of giving away the big finale, I think John Wall, head of QNX, sums up perfectly what QNX is on target for in the automotive industry: "We will dominate the cockpit of the car." It's a bold statement, but we're already amassing some imposing stats to back it up:

QNX announces new platforms for automated driving systems and in-car acoustics

Paul Leroux
Every year, at CES, QNX Software Systems showcases its immense range of solutions for infotainment systems, digital instrument clusters, telematics systems, advanced driving assistance systems (ADAS), and in-car acoustics. This year is no different. Well, actually… let me take that back. Because this year, we are also announcing two new and very important software platforms: one that can speed the development of automated driving systems, and one that can transform how acoustics applications are implemented in the car.

QNX Platform for ADAS
The automotive industry is at an inflection point, with autonomous and semiautonomous vehicles moving from theory to reality. The new QNX Platform for ADAS is designed to help drive this industry transformation. Based on our deep automotive experience and 30-year history in safety-critical systems, the platform can help automotive companies reduce the time and effort of building a full range of ADAS and automated driving applications:
  • from informational ADAS systems that provide a multi-camera, 360° surround view of the vehicle…
  • to sensor fusion systems that combine data from multiple sources such as cameras and radar…
  • to advanced high-performance systems that make control decisions in fully autonomous vehicles



Highlights of the platform include:
  • The QNX OS for Safety, a highly reliable OS pre-certified at all of the automotive safety integrity levels needed for automated driving systems.
  • An OS architecture that can simplify the integration of new sensor technologies and purpose-built ADAS processors.
  • Frameworks and reference implementations to speed the development of multi-camera vision systems and V2X applications (vehicle-to-vehicle and vehicle-to-infrastructure communications).
  • Pre-integrated partner technologies, including systems-on-chip (SoCs), vision algorithms, and V2X modules, to enable faster time-to-market for customers.

This week, at CES 2016, QNX will present several ADAS and V2X demonstrations, including:
  • Demos that show how QNX-based ADAS systems can perform realtime analysis of complex traffic scenarios to enhance driver awareness or enable various levels of automated driving.
  • QNX-based V2X technology that allows cars to “talk” to each other and to traffic infrastructure (e.g. traffic lights) to prevent collisions and improve traffic flow.

To learn more, check out the ADAS platform press release, as well as the press release that provides a full overview of our many CES demos — including, of course, the latest QNX technology concept vehicle!

QNX Acoustics Management Platform
It’s a lesser-known fact, but QNX is a leader in automotive acoustics — its software for handsfree voice communications has shipped in over 40 million automotive systems worldwide. This week, QNX is demonstrating once again why it is a leader in this space, with a new, holistic approach to managing acoustics in the car, the QNX Acoustics Management Platform (AMP):

  • Enables automakers to enhance the audio and acoustic experience for drivers and passengers, while reducing system costs and complexity.
  • Replaces the traditional piecemeal approach to in-car acoustics with a unified model: automakers can now manage all aspects of in-car acoustics efficiently and holistically, for easier integration and tuning, and for faster time-to-production.
  • Reduces hardware costs with a new, low-latency audio architecture that eliminates the need for dedicated digital signal processors or specialized external hardware.
  • Integrates a full suite of acoustics modules, including QNX Acoustics for Voice (for handsfree systems), QNX Acoustics for Engine Sound Enhancement, and the brand new QNX In-Car Communication (ICC).

For anyone who has struggled to hold a conversation in a car at highway speeds, QNX ICC enhances the voice of the driver and relays it to loudspeakers in the back of the vehicle. Instead of shouting or having to turn around to be heard, the driver can talk normally while keeping his or her eyes on the road. QNX will demonstrate ICC this week at CES, in its latest technology concept car, based on a Toyota Highlander.

Read the press release to learn more about QNX AMP.



Bringing a bird’s eye view to a car near you

QNX and TI team up to enable surround-view systems in mass-volume vehicles

Paul Leroux
Uh-oh. You are 10 minutes late for your appointment and can’t find a place to park. At long last, a space opens up, but sure enough, it’s the parking spot from hell: cramped, hard to access, with almost no room to maneuver.

Fortunately, you’ve got this covered. You push a button on your steering wheel, and out pops a camera drone from the car’s trunk. The drone rises a few feet and begins to transmit a bird’s eye view of your car to the dashboard display — you can now see at a glance whether you are about to bump into curbs, cars, concrete barriers, or anything else standing between you and parking nirvana. Seconds later, you have backed perfectly into the spot and are off to your meeting.

Okay, that's the fantasy. In reality, cars with dedicated camera drones will be a long time coming. In the meantime, we have something just as good and a lot more practical: an ADAS application called surround view.

Getting aligned
Approaching an old problem from a new perspective. Credit: TI
Surround-view systems typically use four to six fisheye cameras installed at the front, back, and sides of the vehicle. Together, these cameras capture a complete view of the area around your car, but there’s a catch: the video frames they generate are highly distorted. So, to start, the surround-view system performs geometric alignment of every frame. Which is to say, it irons all the curves out.
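Geometric alignment boils down to remapping each pixel so that straight lines in the world come out straight in the image. Here is a minimal sketch of the idea using a simple polynomial radial-distortion model; the coefficients and function are illustrative assumptions, since real systems calibrate each fisheye lens individually and warp whole frames on dedicated image hardware:

```python
import numpy as np

def undistort_points(pts, center, k1, k2):
    """Remove radial (fisheye-style) distortion from pixel
    coordinates using a simple polynomial model:

        r_undistorted = r_distorted * (1 + k1*r**2 + k2*r**4)

    Illustrative sketch only; production surround-view systems
    use per-camera calibration and full-frame GPU remapping.
    """
    pts = np.asarray(pts, dtype=float)
    v = pts - center                      # vectors from optical center
    r = np.linalg.norm(v, axis=-1, keepdims=True)
    scale = 1.0 + k1 * r**2 + k2 * r**4   # grows with radius
    return center + v * scale             # push outer pixels outward
```

Points at the optical center stay put, while points near the edge of the fisheye image, where the distortion is worst, move the most.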

Next, the system stitches the corrected video frames into a single bird’s eye view. Mind you, this step isn’t simply a matter of aligning pixels from several overlapping frames. Because each camera points in a different direction, each will generate video with unique color balance and brightness levels. Consequently, the system must perform photometric alignment of the image. In other words, it corrects these mismatches to make the resulting output look as if it were taken by a single camera hovering over the vehicle.
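Photometric alignment can be illustrated with a toy version of the gain-matching step: each camera measures the brightness of a patch it shares with a neighbor, and per-camera gains are chosen so those shared patches agree. This sketch (single channel, single reference camera, my own naming) is a simplification of what real systems solve across all pairwise overlaps and all color channels:

```python
import numpy as np

def photometric_gains(overlap_means, reference=0):
    """Compute per-camera brightness gains from overlap regions.

    overlap_means[i] is the mean brightness camera i measured in
    a patch it shares with its neighbors. Scaling camera i's
    pixels by gains[i] makes every camera's view of the shared
    patch match the reference camera's, so the stitched image
    looks like it came from a single camera.
    """
    means = np.asarray(overlap_means, dtype=float)
    return means[reference] / means
```

For example, a side camera that reads the shared patch at 80 while the front camera reads 100 gets a gain of 1.25, pulling its output up to match before stitching.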

Moving down-market
If you think that all this work takes serious compute power, you’re right. The real trick, though, is to make the system affordable so that luxury car owners aren’t the only ones who can benefit from surround view.

Which brings me to QNX Software Systems’ support for TI’s new TDA2Eco system-on-chip (SoC), which is optimized for 3D surround view and park-assist applications. The TDA2Eco integrates a variety of automotive peripherals, including CAN and Gigabit Ethernet AVB, and supports up to eight cameras through parallel, serial and CSI-2 interfaces. To enable 3D viewing, the TDA2Eco includes an image processing accelerator for decoding multiple camera streams, along with graphics accelerators for rendering virtual views.

Naturally, surround view also needs software, which is where the QNX OS for Safety comes in. The OS can play several roles in surround-view systems, such as handling camera input, hosting device drivers for camera panning and control, and rendering the processed video onto the display screen, using QNX Software Systems’ high-performance Screen windowing system. The QNX OS for Safety complies with the ISO 26262 automotive functional safety standard and has a proven history in safety-critical systems, making it ideally suited for collision warning, surround view, and a variety of other ADAS applications.

Okay, enough from me. Let’s look at a video, hosted by TI’s Gaurav Agarwal, to see how the TDAx product line can support surround-view applications:



For more information on the TDAx product line, visit the TI website; for more on the QNX OS for Safety, visit the QNX website.

Five reasons why they should test autonomous cars in Ontario

Did I say five? I meant six…

Paul Leroux
It was late and I needed to get home. So I shut down my laptop, bundled myself in a warm jacket, and headed out to the QNX parking lot. A heavy snow had started to fall, making the roads slippery — but was I worried? Not really. In Ottawa, snow is a fact of life. You learn to live with it, and you learn to drive in it. So I cleared off the car windows, hopped in, and drove off.

Alas, my lack of concern was short-lived. The further I drove, the faster and thicker the snow fell. And then, it really started to come down. Pretty soon, all I could see out my windshield was a scene that looked like this, but with even less detail:



That’s right: a pure, unadulterated whiteout. Was I worried? Nope. But only because I was in a state of absolute terror. Fortunately, I could see the faintest wisp of tire tracks immediately in front of my car, so I followed them, praying that they didn’t lead into a ditch, or worse. (Spoiler alert: I made it home safe and sound.)

Of course, it doesn’t snow every day in Ottawa — or anywhere else in Ontario, for that matter. That said, we can get blanketed with the white stuff any time from October until April. And when we do, the snow can play havoc with highways, railways, airports, and even roofs.

Roofs, you say? One morning, a few years ago, I heard a (very) loud noise coming from the roof of QNX headquarters. When I looked out, this is what I saw — someone cleaning off the roof with a snow blower! So much snow had fallen that the integrity of the roof was being threatened:



When snow like this falls on the road, it can tax the abilities of even the best driver. But what happens when the driver isn’t a person, but the car itself? Good question. Snow and blowing snow can mask lane markers, cover street signs, and block light-detection sensors, making it difficult for an autonomous vehicle to determine where it should go and what it should do. Snow can even trick the vehicle into “seeing” phantom objects.

And it's not just snow. Off the top of my head, I can think of four other phenomena common to Ontario roads that pose a challenge to human and robot drivers alike: black ice, freezing rain, extreme temperatures, and moose. I am only half joking about the last item: autonomous vehicles must respond appropriately to local fauna, not least when the animal in question weighs half a ton.

To put it simply, Ontario would be a perfect test bed for advancing the state of autonomous technologies. So imagine my delight when I learned that the Ontario government has decided to do something about it.

Starting January 1, Ontario will become the first Canadian province to allow road testing of automated vehicles and related technology. The provincial government is also pledging half a million dollars to the Ontario Centres of Excellence Connected Vehicle/Automated Vehicle Program, in addition to $2.45 million already provided.

The government has also installed some virtual guard rails. For instance, it insists that a trained driver stay behind the wheel at all times. The driver must monitor the operation of the autonomous vehicle and take over control whenever necessary.

Testing autonomous vehicles in Ontario simply makes sense, but not only because of the weather. The province also has a lot of automotive know-how. Chrysler, Ford, General Motors, Honda, and Toyota all have plants here, as do 350 parts suppliers. Moreover, the province has almost 100 companies and institutions involved in connected vehicle and automated vehicle technologies — including, of course, QNX Software Systems and its parent company, BlackBerry.

So next time you’re in Ontario, take a peek at the driver in the car next to you. But don’t be surprised if he or she isn’t holding the steering wheel.


A version of this post originally appeared on the Connected Car Expo blog.

What does a decades-old thought experiment have to do with self-driving cars?

Paul Leroux
Last week, I discussed, ever so briefly, some ethical issues raised by autonomous vehicles — including the argument that introducing them too slowly could be considered unethical!

My post included a video link to the trolley problem, a thought experiment that has long served as a tool for exploring how people make ethical decisions. In its original form, the trolley problem is quite simple: You see a trolley racing down a track on which five people are tied up. Next to you is a lever that can divert the trolley to an empty track. But before you can pull the lever, you notice that someone is, in fact, tied up on the second track. Do you do nothing and let all five people die, or do you pull the lever and kill the one person instead?

The trolley problem has been criticized for failing to represent real-world problems, for being too artificial. But if you ask Patrick Lin, a Cal Poly professor who has delivered talks at Google and Tesla on the ethics of self-driving cars, it can serve as a helpful teaching tool for automotive engineers, especially if its underlying concept is framed in automotive terms.

Here is how he presents it:

“You’re driving an autonomous car in manual mode—you’re inattentive and suddenly are heading towards five people at a farmer’s market. Your car senses this incoming collision, and has to decide how to react. If the only option is to jerk to the right, and hit one person instead of remaining on its course towards the five, what should it do?”

Of course, autonomous cars, with their better-than-human driving habits (e.g. people tailgate, robot cars don’t) should help prevent such difficult situations from happening in the first place. In the meantime, thinking carefully through this and other scenarios is just one more step on the road to building fully autonomous, and eventually driverless, cars.

Read more about the trolley problem and its application to autonomous cars in a recent article on The Atlantic.

Speaking of robot cars, if you missed last week's webinar on the role of software when transitioning from ADAS to autonomous driving, don't sweat it. It's now available on demand at Techonline.

The ethics of robot cars

“By midcentury, the penetration of autonomous vehicles... could ultimately cause vehicle crashes in the U.S. to fall from second to ninth place in terms of their lethality ranking.” — McKinsey

Paul Leroux
If you saw a discarded two-by-four on the sidewalk, with rusty nails sticking out of it, what would you do? Chances are, you would move it to a safe spot. You might even bring it home, pull the nails out, and dispose of it properly. In any case, you would feel obliged to do something that reduces the probability of someone getting hurt.

Driver error is like a long sharp nail sticking out of that two-by-four. It is, in fact, the largest single contributor to road accidents. Which raises the question: If the auto industry had the technology, skills, and resources to build vehicles that could eliminate accidents caused by human error, would it not have a moral obligation to do so? I am speaking, of course, of self-driving cars.

Now, a philosopher I am not. I am ready to accept that my line of thinking on this matter has more holes than Swiss cheese. But if so, I’m not the only one with Emmenthal for brain matter. I am, in fact, in good company.

Take, for example, Bryant Walker Smith, a professor in the schools of law and engineering at the University of South Carolina. In an article in MIT Technology Review, he argues that, given the number of accidents that involve human error, introducing self-driving technology too slowly could be considered unethical. (Mind you, he also underlines the importance of accepting ethical tradeoffs. We already accept that airbags may kill a few people while saving many; we may have to accept that the same principle will hold true for autonomous vehicles.)

Then there’s Roger Lanctot of Strategy Analytics. He argues that government agencies and the auto industry need to move much more aggressively on active-safety features like automated lane keeping and automated collision avoidance. He reasons that, because the technology is readily available — and can save lives — we should be using it.

Mind you, the devil is in the proverbial details. In the case of autonomous vehicles, the ethics of “doing the right thing” is only the first step. Once you decide to build autonomous capabilities into a vehicle, you often have to make ethics-based decisions as to how the vehicle will behave.

For instance, what if an autonomous car could avoid a child running across the street, but only at the risk of driving itself, and its passengers, into a brick wall? Whom should the car be programmed to save? The child or the passengers? And what about a situation where the vehicle must hit either of two vehicles — should it hit the vehicle with the better crash rating? If so, wouldn't that penalize people for buying safer cars? This scenario may sound far-fetched, but vehicle-to-vehicle (V2V) technology could eventually make it possible.

The “trolley problem” captures the dilemma nicely:



Being aware of such dilemmas gives me more respect for the kinds of decisions automakers will have to make as they build a self-driving future. But you know what? All this talk of ethics brings something else to mind. I work for a company whose software has, for decades, been used in medical devices that help save lives. Knowing that we do good in the world is a daily inspiration — and has been for the last 25 years of my life. And now, with products like the QNX OS for Safety, we are starting to help automotive companies build ADAS systems that can help mitigate driver error and, ultimately, reduce accidents. So I’m doubly proud.

More to the point, I believe this same sense of pride, of helping to make the road a safer place, will be a powerful motivator for the thousands of engineers and development teams dedicated to paving the road from ADAS to autonomous. It’s just one more reason why autonomous cars aren’t a question of if, but only of when.

From ADAS to autonomous

A new webinar on how autonomous driving technologies will affect embedded software — and vice versa

Paul Leroux
When, exactly, will production cars become fully autonomous? And when will they become affordable to the average Jane or Joe? Good questions both, but in the meantime, the auto industry isn’t twiddling its collective thumbs. It’s already starting to build a more autonomous future through active-control systems that can avoid accidents (e.g. automated emergency braking) and handle everyday driving tasks (e.g. adaptive cruise control).

These systems rely on software to do their job, and that reliance will grow as the systems become more sophisticated and cars become more fully autonomous. This trend, in turn, will place enormous pressure on how the software is designed, developed, and maintained. Safety, in particular, must be front and center at every stage of development.

Which brings me to a new webinar from my inestimable colleague, Kerry Johnson. Titled “The Role of a Software Platform When Transitioning from ADAS to Autonomous Driving,” the webinar will examine:
  • the emergence of high-performance systems-on-chip that target ADAS and autonomous vehicle applications
  • the impact of increasing system integration and autonomous technologies on embedded software
  • the need for functional safety standards such as ISO 26262
  • the emergence of pre-certified products as part of the solution to address safety challenges
  • the role of a software platform to support the evolution from ADAS to autonomous driving

If you are tasked with either developing or sourcing software for functional safety systems in passenger vehicles, this webinar is for you. Here are the coordinates:

Wednesday, October 7
1:00pm EDT

Registration Site



Digital instrument clusters and the road to autonomous driving

Guest post by Walter Sullivan, head of Innovation Lab, Silicon Valley, Elektrobit Automotive

Autonomous driving requires new user experience interfaces, always-on connectivity, new system architectures, and reliable security. In addition, the real estate in the car is changing as we move toward autonomous driving: the traditional display is being replaced by head-up displays (HUDs), digital instrument clusters, and other screens. The digital cluster is where automakers can blend traditional automotive status displays (such as odometer and speed) with safety features, entertainment, and navigation, providing a more personalized, safe, comfortable, and enjoyable driving experience.

For autonomous vehicles, the human-machine interface (HMI) will change with the level of autonomy. Until vehicles are fully autonomous, all the traditional functions of the in-car HMI must be covered and driver distraction needs to be minimized. As we progress through piloted drive toward full autonomy, additional functions take center stage in the instrument cluster: driver assistance information such as distance to the vehicle in front, speed limits, optimized time to destination and fuel consumption, and object detection.

The digital instrument cluster brings a number of benefits to the driver experience including:
  • Comfort: The more information that a driver has about the route, right before his or her eyes, the more comfortable the drive. Digital clusters that provide map data, not just routing guidance but information on the nearest gas station, traffic, upcoming toll roads, etc., give the most comfort by empowering the driver with the information needed to get to the destination quickly and safely.
  • Safety: Drivers benefit from cars that know what’s on the road ahead. Through electronic horizon-based features, clusters can display “predictive” driver-assistance information that delivers important safety messages to the driver.
  • Entertainment: Consumers are looking for vehicles that allow them to transfer their digital lifestyle seamlessly into the driving experience. The cluster can enable such integration, allowing the driver to control a smartphone using the in-car system, stream music, make phone calls, and more.

As more software and technology enters the car and we move closer to the fully autonomous vehicle, the cluster will continue to be the main platform for HMI. Automakers are challenged to build the most user-friendly, personalized clusters they can, with today’s cars employing advanced visual controls that integrate 3D graphics and animation and even natural language voice control. Drivers will rely more heavily on the cluster to provide them information that ensures their safety and comfort during the ride.

Digital instrument cluster developed using EB technology, as shown in the QNX reference vehicle.

Curious about what this kind of technology looks like? Digital instrument clusters developed using Elektrobit (EB) Automotive software will be displayed at the QNX Software Systems booth (C92) during TU-Automotive Detroit, June 3-4. QNX will feature a demo cluster developed using EB GUIDE that integrates a simulated navigation route with EB street director, plus infotainment and car system data. You can also see EB technology in action in the QNX reference vehicle based on a Jeep Wrangler, in which EB street director and the award-winning EB Assist Electronic Horizon are both integrated in the digital cluster.


Walter Sullivan is head of Elektrobit (EB) Automotive’s newly established Silicon Valley Innovation Lab, responsible for developing and leading the company’s presence in Silicon Valley, as well as building and fostering strategic partnerships around the globe.

Visit Elektrobit here.

Keeping it fresh for 35 years

By Megan Alink, Director of Marketing Communications for Automotive

Recently, my colleagues Paul Leroux and Matt Young showed off a shiny new infographic that enlightens readers to the many ways they encounter QNX-based systems in daily life (here and here). After three-and-a-half decades in business, we’ve certainly been around the block a time or two, and you might think things are getting a bit stale. But as the infographic shows, that couldn’t be further from the truth here at QNX. From up in the stars to down on the roads; in planes, trains, and automobiles (and boats too); whether you’re mailing a letter or crafting a BBM on your BlackBerry smartphone, the number and breadth of applications in which our customers deploy QNX technology is simply astounding.

For those who like some sound with their pictures, we also made a video to drive home the point that, wherever you are and whatever you do, chances are you’ll encounter a little QNX. Check it out:


“What do you mean, I have to learn how not to drive?”

The age of autonomous driving lessons is upon us.

Paul Leroux
What would it be like to ride in an autonomous car? If you were to ask the average Joe, he would likely describe a scenario in which he sips coffee, plays video games, and spends quality time with TSN while the car whisks him to work. The average Jane would, no doubt, provide an equivalent answer. The problem with this scenario is that autonomous doesn’t mean driverless. Until autonomous vehicles become better than humans at handling every potential traffic situation, drivers will have to remain alert much or all of the time, even if their cars do 99.9% of the driving for them.

Otherwise, what happens when a car, faced with a situation it can’t handle, suddenly cedes control to the driver? Or what happens when the car fails to recognize a pedestrian on the road ahead?

Of course, it isn’t easy to maintain a high level of alertness while doing nothing in particular. It takes a certain maturity of mind, or at least a lack of ADD. Which explains why California, a leader in regulations for autonomous vehicles, imposes restrictions on who is allowed to “drive” them. Prerequisites include a near-spotless driving record and more than 10 years without a DUI conviction. Drivers must also complete an autonomous driving program, the length of which depends on the car maker or automotive supplier in question. According to a recent investigation by IEEE Spectrum, Google offers the most comprehensive program — it lasts five weeks and subjects drivers to random checks.

1950s approach to improving driver alertness. Source: Modern Mechanix blog

In effect, drivers of autonomous cars have to learn how not to drive. And, as another IEEE article suggests, they may even need a special license.

Ample warnings
Could an autonomous car mitigate the attention issue? Definitely. It could, for example, give the driver ample warning before he or she needs to take over. The forward collision alerts and other informational ADAS functions in the latest QNX technology concept car offer a hint as to how such warnings could operate. For the time being, however, it’s hard to imagine an autonomous car that could always anticipate when it needs to cede control. Until then, informational ADAS will serve as an adjunct to, not a replacement for, eyes, ears, and old-fashioned attentiveness.

Nonetheless, research suggests that adaptive cruise control and other technologies that enable autonomous or semi-autonomous driving can, when compared to human drivers, do a better job of avoiding accidents and improving traffic flow. To quote my friend Andy Gryc, autonomous cars would be more “polite” to other vehicles and be better equipped to negotiate inter-vehicle space, enabling more cars to use the same length of road.

Fewer accidents, faster travel times. I could live with that.


2015 approach to improving driver alertness: instrument cluster from the QNX reference vehicle.

New to 26262? Have I got a primer for you

Driver error is the #1 problem on our roads — and has been since 1869. In August of that year, a scientist named Mary Ward became the first person to die in an automobile accident, after being thrown from a steam-powered car. Driver error was a factor in Mary’s death and, 145 years later, it remains a problem, contributing to roughly 90% of motor vehicle crashes.

Can ADAS systems mitigate driver error and reduce traffic deaths? The evidence suggests that, yes, they help prevent accidents. That said, ADAS systems can themselves cause harm, if they malfunction. Imagine, for example, an adaptive cruise control system that underestimates the distance of a car up ahead. Which raises the question: how can you trust the safety claims for an ADAS system? And how do you establish that the evidence for those claims is sufficient?

Enter ISO 26262. This standard, introduced in 2011, provides a comprehensive framework for validating the functional safety claims of ADAS systems, digital instrument clusters, and other electrical or electronic systems in production passenger vehicles.

ISO 26262 isn’t for the faint of heart. It’s a rigorous, 10-part standard that recommends tools, techniques, and methodologies for the entire development cycle, from specification to decommissioning. In fact, to develop a deep understanding of 26262 you must first become versed in another standard, IEC 61508, which forms the basis of 26262.

ISO 26262 starts from the premise that no system is 100% safe. Consequently, the system designer must perform a hazard and risk analysis to identify the safety requirements and residual risks of the system being developed. The outcome of that analysis determines the Automotive Safety Integrity Level (ASIL) of the system, as defined by 26262. ASILs range from A to D, where A represents the lowest degree of hazard and D, the highest. The higher the ASIL, the greater the degree of rigor that must be applied to assure the system avoids residual risk.
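The ASIL classification just described follows a fixed lookup defined in the standard: the hazard and risk analysis assigns classes for severity (S1 to S3), probability of exposure (E1 to E4), and controllability (C1 to C3), and their combination yields either QM ("quality managed", no ASIL-specific measures) or ASIL A through D. As a minimal sketch (illustrative only, not a substitute for ISO 26262-3 itself), the standard's lookup table can be reproduced with the well-known additive shortcut:

```python
# Simplified ASIL determination following the classification scheme of
# ISO 26262-3 (hazard analysis and risk assessment). The additive shortcut
# below reproduces the standard's S/E/C lookup table; it is a sketch for
# illustration, not a replacement for consulting the standard.

def asil(severity: int, exposure: int, controllability: int) -> str:
    """Map S (1-3), E (1-4), and C (1-3) classes to an ASIL.

    The sum S + E + C determines the level: 7 -> ASIL A, 8 -> B,
    9 -> C, 10 -> D; anything lower is QM (quality managed).
    """
    if not (1 <= severity <= 3 and 1 <= exposure <= 4
            and 1 <= controllability <= 3):
        raise ValueError("expected S in 1..3, E in 1..4, C in 1..3")
    total = severity + exposure + controllability
    return {7: "ASIL A", 8: "ASIL B", 9: "ASIL C", 10: "ASIL D"}.get(total, "QM")

# Worst case: life-threatening harm (S3), high exposure (E4),
# difficult to control (C3) -- the highest degree of rigor.
print(asil(3, 4, 3))  # ASIL D
# Light injuries, rarely encountered, easily controllable: no ASIL.
print(asil(1, 1, 1))  # QM
```

(Classes S0, E0, and C0 also exist in the standard and lead directly to QM; they are omitted here for brevity.)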

Having determined the risks (and the ASIL), the system designer selects an appropriate architecture. The designer must also validate that architecture, using tools and techniques that 26262 either recommends or highly recommends. If the designer believes that a recommended tool or technique isn’t appropriate to the project, he or she must provide a solid rationale for the decision, and must justify why the technique actually used is as good as or better than the one recommended by 26262.

The designer must also prepare a safety case. True to its name, this document presents the case that the system is sufficiently safe for its intended application and environment. It comprises three main components: 1) a clear statement of what is claimed about the system, 2) the argument that the claim has been met, and 3) the evidence that supports the argument. The safety case should convince not only the 26262 auditor, but also the entire development team, the company’s executives, and, of course, the customer. Of course, no system is safe unless it is deployed and used correctly, so the system designer must also produce a safety manual that sets the constraints within which the product must be deployed.

Achieving 26262 compliance is a major undertaking. That said, any conscientious team working on a safety-critical project would probably apply most of the recommended techniques. The standard was created to ensure that safety isn’t treated as an afterthought during final testing, but as a matter of due diligence in every stage of development.

If you’re a system designer or implementer, where do you start? I would suggest “A Developer’s View of ISO 26262”, an article recently authored by my colleague Chris Hobbs and published in EE Times Automotive Europe. The article provides an introduction to the standard, based on experience certifying software to ISO 26262, and covers key topics such as ASILs, recommended verification tools and techniques, the safety case, and confidence from use.

I also have two whitepapers that may prove useful: Architectures for ISO 26262 systems with multiple ASIL requirements, written by my colleague Yi Zheng, and Protecting software components from interference in an ISO 26262 system, written by Chris Hobbs and Yi Zheng.

Driving simulators at CES

CES was just 15 minutes from closing when I managed to slip away from the very busy QNX booth to try out an F1 simulator. Three screens, 6 degrees of freedom, and surround sound came together for the most exciting simulated driving experience I have ever had. I was literally shaking when they dragged me out of the driver’s seat (I didn’t want to stop :-). Mind you, at around $80K for the system, it seems unlikely I will ever own one.

The experience got me thinking about the types of vehicles currently in simulation or in the lab that I fully expect to drive in my lifetime: cars that are virtually impossible to crash, cars that make it painless to travel long distances, and, ultimately, cars that worry about traffic jams so I can read a book.

Re-incarnated: the QNX reference vehicle.
QNX Software Systems had a very popular simulator of its own at CES this year. You may have seen some details on it already, but to recap: it is a new incarnation of our trusty QNX reference vehicle, extended to demonstrate ADAS capabilities. We parked it in front of a 12-foot display and used video footage captured on California’s fabled Highway 1 to provide the closest thing to real-world driving we could create.

The resulting virtual drive showcased the capabilities not only of QNX technology, but of our ecosystem as well. Using the video footage, we provided camera inputs to Itseez’ computer vision algorithms to demonstrate a working example of lane departure warning and traffic sign recognition. By capturing GPS data synchronized with the video footage, and feeding the result through Elektrobit’s Electronic Horizon Solution, we were able to generate curve speed warnings. All this was running on automotive-grade Jacinto 6 silicon from Texas Instruments. LiDAR technology from Phantom Intelligence rounded out the offering by providing collision feedback to the driver.
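To give a feel for what a lane departure warning computes once the vision algorithms have done their work: after the lane boundaries are located in the camera frame, the warning itself reduces to simple geometry. The sketch below illustrates only that final step; the function name, inputs, and threshold are invented for this example and are not the demo’s actual code.

```python
# Hypothetical last stage of a lane departure warning (illustration only,
# not Itseez's vision code): assume an upstream lane-detection algorithm
# has already produced the x-positions of the left and right lane
# boundaries at the bottom of the camera frame. The warning then compares
# the lane center against the camera (vehicle) center.

def lane_departure_warning(left_x: float, right_x: float,
                           frame_width: float,
                           threshold_frac: float = 0.15) -> bool:
    """Warn when the camera center drifts from the lane center by more
    than threshold_frac of the lane width (threshold is made up)."""
    lane_center = (left_x + right_x) / 2.0
    camera_center = frame_width / 2.0
    lane_width = right_x - left_x
    offset = abs(camera_center - lane_center)
    return offset > threshold_frac * lane_width

# Centered in the lane: no warning.
print(lane_departure_warning(400, 880, 1280))  # False
# Drifted well toward the left boundary: warning.
print(lane_departure_warning(200, 680, 1280))  # True
```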

The lane departure and curve speed warnings in action. Screen-grab from video by Embedded Computing Design.

Meeting the challenge
While at CES, I also had the opportunity to meet with companies that are working to make advanced ADAS systems commercially viable. Phantom Intelligence is one example but I was also introduced to companies that can provide thermal imaging systems and near-infrared cameras at a fraction of what these technologies cost today.

These are all examples of how the industry is rising up to meet the challenge of safer, more autonomous vehicles at a price point that allows for widespread adoption in the foreseeable future. Amazing stuff, really — we are finally entering the era of the Jetsons.

By the way, I can’t remember what booth I was in when I drove the simulator. But I’m willing to bet that the people who experienced the Jeep at CES will remember they were in the QNX booth, seeing technology from QNX and its key partners in this exciting new world.

One day I’ll be Luke Skywalker

What happens when you blend ADAS with infotainment? Guest post by Cyril Clocher, business manager for automotive processors at Texas Instruments

As we all begin preparing for our trek to Vegas for CES 2015, I would like my young friends (born in the 70s, of course) to reflect on their impressions of the first episode of Lucas’s trilogy back in 1977. On my side, I perfectly remember thinking one day I would be Luke Skywalker.

Young boys and girls were amazed by this epic space opera, and particularly by the technologies our heroes used to fight the Galactic Empire. Remember, it was an era when we still used rotary phones and GPS was in its infancy. So you can imagine how impactful it was for us to see our favorite characters using wireless electronic gadgets with revolutionary HMIs such as natural voice recognition, gesture control, and touch screens; droids speaking and enhancing human intelligence; and autonomous vehicles traveling the galaxy safely while playing chess with a Wookiee. Now you’re with me…

But instead of becoming Luke Skywalker a lot of us realized that we would have a bigger impact by inventing or engineering these technologies and by transforming early concepts into real products we all use today. As a result, smartphones and wireless connectivity are now in our everyday lives; the Internet of Things (IoT) is getting more popular in applications such as activity trackers that monitor personal metrics; and our kids are more used to touch screens than mice or keyboards, and cannot think of on-line gaming without gesture control. In fact, I just used voice recognition to upgrade the Wi-Fi plan from my Telco provider.

But the journey is not over yet. Our generation still has to deliver an autonomous vehicle that is green, safe, and fun to control. In fact, I think the word “drive” will be obsolete for such a vehicle.

The automotive industry has taken several steps toward this exciting goal, including the integration of advanced, connected in-car infotainment systems in more models, as well as a number of technologies categorized under Advanced Driver Assistance Systems (ADAS) that can create a safer and unique driving experience. For more than a decade, Texas Instruments has invested in infotainment and ADAS: “Jacinto” and TDAx automotive processors, as well as the many analog companion chips supporting these trends.

"Jacinto 6 EP" and "Jacinto 6 Ex" infotainment processors
What makes TI’s approach unique is the capability to leverage the best of both worlds for non-safety-critical features, and to provide seamless integration of informational ADAS functions into existing infotainment systems so the vehicle better informs and warns the driver. We announced that capability at SAE Convergence in Detroit in October 2014 with the “Jacinto 6 Ex” processor (DRA756), which contains powerful CPU, graphics, multimedia, and radio cores with differentiated vision co-processors, called embedded vision engines (EVEs), and additional DSPs that perform the complex ADAS processing.

For TI’s automotive team, the CES 2015 show is even more exciting than in previous years, as we’ve taken our concept of informational ADAS to the next step. With joint effort and hard work from both the TI and QNX teams, we’ve implemented a real informational ADAS system running the QNX CAR™ Platform for Infotainment on a “Jacinto 6 Ex” processor.

I could try describing this system in detail, but just like the Star Wars movies, it’s best to experience our “Jacinto 6 Ex” and QNX CAR Platform-based system in person. Contact your TI or QNX representative today and schedule a meeting to visit our private suite at CES at the TI Village (N115-N119) or to immerse yourself in a combined IVI, cluster, megapixel surround view, and DLP® based HUD display with augmented reality running on a single “Jacinto 6 Ex” SoC demonstration. And don't forget to visit the QNX booth (2231), where you can see the QNX reference vehicle running a variety of ADAS and infotainment applications on “Jacinto 6” processors.

Integrated cockpit featuring DLP powered HUD and QNX CAR Platform running on a single “Jacinto 6 Ex” SoC.
One day I’ll experience Skywalker’s life: I will no doubt have the opportunity to control an intelligent and autonomous vehicle with my biometrics, voice, and gestures while riding with my family to the movie theater, playing chess with my grandkids (not yet a Wookiee).

A need for speed... and safety

Matt Shumsky
For me, cars and safety go hand in hand. Don’t get me wrong, I have a need for speed. I do, after all, drive a 2006 compact with 140 HP (pause for laughter). But no one, and I mean no one, wants to be barreling down a highway in icy conditions at 120 km/h without working brakes, am I right?

Which raises the question: what’s the best way to design a software system that ensures the adaptive cruise control keeps a safe distance from the car ahead? Or that tells the digital instrument cluster the correct information to display? And how can you make sure the display information isn’t corrupted?

Enter QNX and the ISO 26262 functional safety standard.

QNX Software Systems is partnering with LDRA to present a webinar on “Ensuring Automotive Functional Safety”. During this webinar, you’ll learn about:
  • Development and verification tools proven to help provide safer automotive software systems
  • How suppliers can develop software systems faster with an OS tuned for automotive safety

Ensuring Automotive Functional Safety with QNX and LDRA
Thursday, November 20, 2014
9:00 am PST / 12:00 pm EST / 5:00 pm UTC

I hope you can join us!

Japan update: ADAS, wearables, integrated cockpits, and autonomous cars

Yoshiki Chubachi
Will the joy of driving be a design criterion for tomorrow’s vehicles? It had better be.

A couple of weeks ago, QNX Software Systems sponsored Telematics Japan in Tokyo. This event offers a great opportunity to catch up with colleagues from automotive companies, discuss technology and business trends, and showcase the latest technology demos. Speaking of which, here’s a photo of me with a Japan-localized demo of the QNX CAR Platform. You can also see a QNX-based digital instrument cluster in the lower-left corner — this was developed by Three D, one of our local technology partners:



While at the event, I spoke on the panel, “Evolving ecosystems for future HMI, OS, and telematics platform development.” During the discussion, we conducted a real-time poll and asked the audience three questions:

1) Do you think having Apple CarPlay and Android Auto will augment a vehicle brand?
2) Do you expect wearable technologies to be integrated into cars?
3) If your rental car were hacked, who would you complain to?

For question 1, 32% of the audience said CarPlay and Android Auto will improve a brand; 68% didn't think so. In my opinion, this result indicates that smartphone connectivity in cars is now an expected feature. For question 2, 76% answered that they expect to see wearables integrated into cars. This response gives us a new perspective — people are looking at wearables as a possible addition to go with ADAS systems. For example, a wearable device could help prevent accidents by monitoring the driver for drowsiness and other dangerous signs. For question 3, 68% said they would complain to the rental company. Mind you, this raises the question: if your own car were hacked, who would you complain to?

Integrated cockpits
There is growing concern around safety and security as companies attempt to grow more business by leveraging connectivity in cars. The trend is apparent if you look at the number of safety- and security-related demos at various automotive shows.

Case in point: I recently attended a private automotive event hosted by Renesas, where many ADAS and integrated cockpit demos were on display. And last month, CEATEC Japan (aka the CES of Japan) featured integrated cockpit demos from companies like Fujitsu, Pioneer, Mitsubishi, Kyocera, and NTT Docomo.

For the joy of it
Things are so different from when I first started developing in-car navigation systems 20 years ago. Infotainment systems are now turning into integrated cockpits. In Japan, the automotive industry is looking at the early 2020s as the time when commercially available autonomous cars will be on the road. In the coming years, the in-car environment, including infotainment, cameras, and other systems, will change immensely. I’m not exactly sure what cars in the year 2020 will look like, but I know it will be something I could never have imagined 20 years ago.

A panel participant at Telematics Japan said to me, “If autonomous cars become reality and my car is not going to let me drive anymore, I am not sure what the point of having a car is.” This is true. As we continue to develop for future cars, we may want to remind ourselves of the “joy of driving” factor.

Are you ready to stop micromanaging your car?

I will get to the above question. Honest. But before I do, allow me to pose another one: When autonomous cars go mainstream, will anyone even notice?

The answer to this question depends on how you define the term. If you mean completely and absolutely autonomous, with no need for a steering wheel, gas pedal, or brake pedal, then yes, most people will notice. But long before these devices stop being built into cars, another phenomenon will occur: people will stop using them.

Allow me to rewind. Last week, Tesla announced that its Model S will soon be able to “steer to stay within a lane, change lanes with the simple tap of a turn signal, and manage speed by reading road signs and using traffic-aware cruise control.” I say soon because these functions won't be activated until owners download a software update in the coming weeks. But man, what an update.

Tesla may now be at the front of the ADAS wave, but the wave was already forming — and growing. Increasingly, cars are taking over mundane or hard-to-perform tasks, and they will only become better at them as time goes on. Whether it’s autonomous braking, automatic parking, hill-descent control, adaptive cruise control, or, in the case of the Tesla S, intelligent speed adaptation, cars will do more of the driving and, in so doing, socialize us into trusting them with even more driving tasks.

Tesla Model S: soon with autopilot
In other words, the next car you buy will prepare you for not having to drive the car after that.

You know what’s funny? At some point, the computers in cars will probably become safer drivers than humans. The humans will know it, but they will still clamor for steering wheels, brake pedals, and all the other traditional accoutrements of driving. Because people like control. Or, at the very least, the feeling that control is there if you want it.

It’s like cameras. I would never think of buying a camera that didn’t have full manual mode. Because control! But guess what: I almost never turn the mode selector to M. More often than not, it’s set to Program or Aperture Priority, because both of these semi-automated modes are good enough, and both allow me to focus on taking the picture, not on micromanaging my camera.

What about you? Are you ready for a car that needs a little less micromanagement?

Domo arigato, for self-driving autos

Lynn Gayowski
When talk moves to autonomous cars, Google's self-driving car is often the first project that springs to mind. However, there are a slew of automakers with autonomous or semi-autonomous vehicles in development — Audi, BMW, General Motors, Mercedes-Benz, and Toyota, to name a few. And did you know that QNX has been involved with autonomous projects since 1997?

Let's begin at the beginning. Obviously the first step is to watch the 1983 Mr. Roboto music video. To quote selectively, "I've come to help you with your problems, so we can be free." As Styx aptly communicated with the help of synthesizers, robots have the potential to improve our lives. Current research predicts autonomous cars will reduce traffic collisions and improve traffic flow, plus drivers will be freed up for other activities.

So let's take a look at how QNX has been participating in the progress to self-driving vehicles.



The microkernel architecture of the QNX operating system provides an exemplary foundation for systems with functional safety requirements, and as you can see from this list, there are projects related to cars, underwater robots, and rescue vehicles.

Take a look at this 1997 video from the California Partners for Advanced Transportation Technology (PATH) and the National Automated Highway System Consortium (NAHSC) showing their automated driving demo — the first project referenced on our timeline. It's interesting that the roadway and driving issues mentioned in this video still hold true 17 years later.



We're estimating that practical use of semi-autonomous cars is still 4 years away and that fully autonomous vehicles won't be available to the general public for about another 10 years after that. So stay tuned to the QNX Auto Blog. I'm already envisioning a 30-year montage of our autonomous projects. With a stirring soundtrack by Styx.

Large SUV Sales Larry Wood LaSalle Latvia launch law enforcement lawnmower laws Le Mans legends Leno Lexus license plates Lidar Life Insurance limited Lincoln Lincoln MKZ Linda Campbell Linda Vaughn links lists Lithuania live Loans Locomobile logging train logging trucks Lola London to Brighton Looking for EV's Los Angeles Lotus lowrider LSR Luxembourg luxury Lyft Lynn Gayowski Mach 1 machine shop Mack Mad Max magazine magazines magic iris mags Malaysia March 2017 Mario Andretti Mark Donohue marketing Marketshare Maserati Matt Watson Maverick Mazda Mazda Reviews MB McLaren mechanic Megan Alink meme Memory Lane Men Micro Mercedes Mercedes Benz Mercedes-Benz Mercer Cobra Mercury Metallica Metro Mexico Miata microkernal Midsize Car Sales Midsize Luxury Car Sales Midsize Luxury SUV Sales Midsize SUV Sales Military Miller race car mini mini bike miniature Minivan Sales MirrorLink mission-critical Mitsubishi Miura MMI Mobile connectivity Mobile World Congress mod top Model Model A model T modifications Momo Monaco Monster Truck Moon Moon eyes Mopar Mopar parts Morgan Morocco morons mot Motor shows motor wheel Motorcycle Motorcycles motorhomes Mouse movie movies mpv Multicore Munsters Muntz muscle cars musclecars museum music video Mustang NAIAS Nancy Young Nascar Nash Navigation naza neglec neglected Netherlands new tv show New York New Zealand news ni Nissan Nissan Reviews Nomad Norway nos nose art Nova November 2016 Nurburgring Object Management group October 2016 off roading offenhauser Oldsmobile OMG Online College OnStar Opel Open source Open standards OpenGL ES option orders original owner Ormond Beach land speed racing pace car Packard Pagani Paige pamphlet panel paint Paris to Peking race parking parts Patryk Fournier Paul Leroux Paul Newman Paul Sykes Pebble Beach pedal car perodua personal Peter McCarthy petrol petroliana Peugeot Phoenix Injury photographer photography pics pictures Pierce Arrow Pike's Peak Pinin Farina pinstriping Pit row Pits Pixar PKI plank 
road PlayBook Plymouth Point Grey Camera Poland pole wheel police Polysynch Pontiac Porsche Porsche 917 Porsche Carrera Portugal POSIX pre 1930's gas station Premium Sporty Car Sales President of the USA Preview prices prius project prooject Proton prototype PSA Peugeot Citroen public key cryptography Pullman QNX QNX CAR QNX Garage QNX OS Qualcomm quiz quote race cars racing racing. LSR Radar radio Raid Data rail railcars railroad ralliart Rally rallying Ram range rover rant Rapid Transit System advertsing rare Real time Innovations recall recommended shop record setter Red Bull Sports Reference vehicle Reliability Rémi Bastien RemoteLink Renault Renesas Renntransporter rentals REO repair reports resarch research restoration restoration shop review Richard Bishop Ridler Award Winner rims river bank cars road and highway Road Runner roadster Robot OS Robot wars Roewe Roger Penske Rolls Royce Romain Saha Romania ROS Roth RTI RTI Connext rumble seat Russia Ruxton RV Safety Safety systems safety-certified sales Sales By Model Sales Stats samba sampan Saoutchik Satellite satnav Scaglietti scallops Scat Pack SCCA racecar School bus sci-fi Scooter SCORE Baja trucks Scott Pennock Scout sculpture Security sedan segway semi sensor extension cable sensor fusion September 2016 service service repair automotive vehicle car buying selling mission statement blog free broker shay drive locomotive Shelby shifter shop Show cars sidecars signs skateboarding Skoda slicks slingshot dragster Slovakia Slovenia Small Luxury SUV Sales Small SUV Sales Smart Smartphones snow machines snowmobile Soapbox South Africa South Korea Sox and Martin Spain spare tire spark ignition spark plug spark plugs Spatial auditory displays special edition Mustangs Speech interfaces speed limit Speed Record speedfest speedster sports car sports cars Sporty Car Sales spy shots spyker Sri Lanka SS SS/AH Stagecoach Stanley Station Wagon steam locomotive steam powered steam shovel steampunk steering wheel Steve 
McQueen Stig Stirling Moss Stolen streamliner street cars Street Van studebaker stunt stunts Stutz Stutz Blackhawk Subaru Sunbeam Super Bee Super Stock Superbird Supercar supercharger survey suv Suzuki Sweden Swift Switzerland System development Life Cycle Tablets Tach takeover tank tata tata magic iris tata vehicles tax Tax Deduction For Car Donation taxi taxi cab TCS tdi teardrop technical technology Telematics Telematics Detroit Telematics Update tempo Tempo Matador Terlingua Racing Team Terry Staycer Tesla test testdrive Texas Instruments The Race Of Gentlemen Thomas Bloor thoughts three wheeler Thunderbird ticket Tiger Tim Neil Tina Jeffrey tips tires tool tool kit toolbox tools Top Gear top ten list Torino tour bus tourbus towtruck Toyota Toyota Entune Toyota Reviews tractor trailer train train wreck trains Trans Am transmission Transporter Traval trike Triumph trivia trolley Troy Trepanier truck Truck Sales trucking trucks Tucker turbocharger turbojet turbonique Turkey tv tv cars twin spark type 1 type 2 tyres UAE Uber UK UK Auto Sales UK Best Sellers uk market Ukraine Unimog unique University of Waterloo Unser unusual unveil upgrade US US 2016 Sales US All Cars Rankings US All SUV Rankings US All Vehicle Rankings US Auto Sales US Auto Sales By Brand US Best Sellers US Compact Car Sales US December 2016 US Entry Luxury Car Sales US February 2017 US January 2017 US Large Car Sales US Large Luxury Car Sales US Large Luxury SUV Sales US Large SUV Sales US March 2017 US Midsize Car Sales US Midsize Luxury Car Sales US Midsize Luxury SUV Sales US Midsize SUV Sales US Minivan Sales US Navy US November 2016 US October 2016 US September 2016 US Small Luxury SUV Sales US Small SUV Sales US Sporty Car Sales US Truck Sales US US Auto Sales US Van Sales US Worst Sellers USA used cars V2X van Van Sales vauxhall VeDeCoM Vehicle Donation California Velodyne Vespa Video vintage vintage racing Virtual mechanic Virtualization VOIP Guide Volkswagen Volkswagen Reviews 
Volkswagen Sales Volvo Von Dutch vote VW VW bug W3C wagon train wall of death washer washer fluid Watson's Webinars website what is donation what is it wheel speed sensor wheelchair White williams Willys windshield washer wing Wireless framework women woodlight headlights Woody work truck working principle of anti-lock braking system workshop World Worst Sellers wreck Wrongful Death WW1 WW2 XK SS Yoram Berholtz Yoshiki Chubachi Z 11 Z-28 Z28 zamboni ZL1 Zotye