Automotive
Paul Leroux
When plugging a media device into a car’s head unit, most users expect immediate access to the device content; they also want to browse the content by metadata, such as genre, title, or artist. To present this content, the head unit must perform metadata synching. The question is, how can the head unit make the content instantly available, even when the media device contains thousands of files that may take many seconds or even minutes to fully synchronize?
To complicate matters, users often want to switch from one media source to another. For instance, a user listening to music stored on a DLNA device may ask the head unit to switch to an Internet radio station. From the user’s perspective, the switch should be fast, simple, and intuitive.
Handling device attachments (and detachments) gracefully.
Handling scenarios like these is the job of the QNX CAR Platform’s multimedia architecture.
Architecture at a glance
The multimedia architecture integrates several software components to automatically detect media devices, synchronize metadata with media databases, browse the contents of devices, and, of course, play audio and video files. Together, these components form three layers:
- Human machine interface, or HMI
- Multimedia components
- OS services
Let’s look at each of these layers in turn, starting with the HMI.
At the top of the HMI layer sits the Media Player, a reference application that allows end users to control media browsing and playback. Developers can customize this player or write their own player apps, using APIs provided by the QNX CAR Platform.
The Media Player comes in two flavors, HTML5 and Qt 5. To communicate with the architecture’s multimedia engine (mm-player), the HTML5 version uses the car.mediaplayer JavaScript API while the Qt version uses the QPlayer library. In addition to these interfaces, custom apps can use the multimedia engine’s C API. All three interfaces — car.mediaplayer, QPlayer, and C API — provide an abstraction layer that allows a media player app to:
- retrieve a list of accessible media sources: local drives, USB storage devices, iPods, etc.
- retrieve track metadata: artist name, album name, track title, etc.
- start and stop playback
- jump to a specific track
- handle updates in playback state, media sources, and track position
The interfaces that provide access to these operations aren’t specific to any device type, so player apps can work with a wide variety of media hardware.
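To make the pattern concrete, here's a minimal sketch in C of the flow all three interfaces support: enumerate the available sources, then start playback. The names here (mp_source_t, mp_get_sources, mp_play) and their stubbed bodies are hypothetical stand-ins, not the real car.mediaplayer, QPlayer, or mm-player C API; consult the platform documentation for the actual calls.

```c
/*
 * Hypothetical sketch of the player-facing abstraction. The names
 * (mp_source_t, mp_get_sources, mp_play) and the stubbed bodies are
 * illustrative stand-ins, NOT the real car.mediaplayer, QPlayer,
 * or mm-player C interfaces.
 */
#include <stdio.h>
#include <stddef.h>

typedef struct {
    int         id;
    const char *name;
} mp_source_t;

/* Stubbed "engine" calls so the sketch is self-contained. */
static int mp_get_sources(mp_source_t **list, size_t *count)
{
    static mp_source_t sources[] = { { 0, "USB stick" }, { 1, "iPod" } };
    *list  = sources;
    *count = sizeof sources / sizeof sources[0];
    return 0;   /* 0 == success */
}

static int mp_play(int source_id, int track_index)
{
    printf("playing track %d on source %d\n", track_index, source_id);
    return 0;
}

int main(void)
{
    mp_source_t *sources;
    size_t n;

    /* 1. Ask the engine which media sources are currently attached. */
    if (mp_get_sources(&sources, &n) == 0)
        for (size_t i = 0; i < n; i++)
            printf("source %d: %s\n", sources[i].id, sources[i].name);

    /* 2. Start playback; the call looks the same whether the source
     *    is a USB stick, an iPod, or a DLNA server. */
    if (n > 0)
        mp_play(sources[0].id, 0);

    return 0;
}
```

The point is the shape of the abstraction: the player app never names a device type, so the same code path serves every kind of media hardware.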
The media player can quickly access and display a variety of metadata (artist name, album name, track title, etc.) stored in a small-footprint SQL database.
Multimedia components layer
At the top of the multimedia components layer sits mm-player, the architecture's media browsing and playback engine. The mm-player does the dirty work of retrieving metadata, starting playback, jumping to a specific track, and so on, which makes custom player apps easier to design. It also supports a wide variety of media sources, including:
- local drives
- USB storage devices
- Apple iPod devices
- DLNA devices, including phones and media players
- MTP devices, including PDAs and media players
- devices paired through Bluetooth
To perform media operations requested by a client media player, mm-player works in concert with several lower-level components that help navigate media-store file systems, read metadata from media files, and manage media flows during playback. The components include a series of plugins (POSIX, AVRCP, DLNA, etc.) that interface with different device types. For instance, let’s say you insert an SD card. The POSIX plugin supports this type of device, so it will learn of the insertion and inform mm-player of the newly connected media source; it will also support any subsequent media operations on the SD card.
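The plugin idea is easy to picture as a table of callbacks, one set per device class, dispatched by the engine. The sketch below is purely illustrative; mm-player's actual plugin interface is internal to the platform and isn't documented here.

```c
/*
 * Purely illustrative sketch of the plugin idea: one set of callbacks
 * per device class, dispatched by the engine. This is NOT mm-player's
 * actual plugin interface, which is internal to the platform.
 */
#include <stdio.h>

typedef struct {
    const char *name;                        /* e.g. "posix", "dlna"  */
    int (*attach)(const char *mountpoint);   /* new device appeared   */
    int (*play)(const char *track_url);      /* start playback        */
} media_plugin_t;

static int posix_attach(const char *mountpoint)
{
    printf("posix plugin: new media source at %s\n", mountpoint);
    return 0;
}

static int posix_play(const char *url)
{
    printf("posix plugin: playing %s\n", url);
    return 0;
}

static const media_plugin_t posix_plugin = { "posix", posix_attach, posix_play };

int main(void)
{
    /* When an SD card or USB stick appears, the engine routes the
     * event to the plugin that owns that device class; the player app
     * above this layer never sees the difference. */
    posix_plugin.attach("/fs/sd0");
    posix_plugin.play("/fs/sd0/music/track01.mp3");
    return 0;
}
```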
Several other components provide services to mm-player. These include:
- mm-detect — discovers media devices and initiates synchronization of metadata
- mm-sync — synchronizes metadata from tracks and playlists on media devices into small-footprint SQL databases called QDB databases
- mm-renderer — plays audio and video tracks, and reports playback state (see the sketch just after this list)
- io-audio — starts audio device drivers to enable the output of audio streams
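Of these, mm-renderer is the one that playback code touches most directly. Here's a short sketch of playing a single track through the mm-renderer client API; the calls follow the client library shipped with the QNX SDP, but verify the exact signatures against your platform's <mm/renderer.h>, and note that the context name and file path are made up for illustration.

```c
/*
 * Sketch of playing one track through the mm-renderer client API.
 * The calls follow the client library shipped with the QNX SDP;
 * verify the exact signatures against your platform's <mm/renderer.h>.
 * The context name and file path are made up for illustration.
 */
#include <stdio.h>
#include <sys/stat.h>
#include <mm/renderer.h>

int main(void)
{
    mmr_connection_t *conn = mmr_connect(NULL);     /* default service */
    if (conn == NULL) {
        perror("mmr_connect");
        return 1;
    }

    mmr_context_t *ctx = mmr_context_create(conn, "myplayer", 0, S_IRWXU);
    if (ctx == NULL) {
        perror("mmr_context_create");
        mmr_disconnect(conn);
        return 1;
    }

    /* Route decoded audio to the default output, attach one track,
     * and start playback. */
    if (mmr_output_attach(ctx, "audio:default", "audio") < 0
     || mmr_input_attach(ctx, "file:///fs/usb0/music/track01.mp3", "track") < 0
     || mmr_play(ctx) < 0) {
        fprintf(stderr, "playback setup failed\n");
    }

    /* ... a real app would monitor playback state events here ... */

    mmr_stop(ctx);
    mmr_context_destroy(ctx);
    mmr_disconnect(conn);
    return 0;
}
```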
OS services layer
The lowest layer of the multimedia architecture includes device drivers and protocol stacks that, among other things, detect when the user inserts or removes a media device. The following steps summarize what happens when one of these services detects an insertion:
- User inserts the device.
- The corresponding driver or protocol stack informs device publishers of the insertion.
- The publishers write the device information to Persistent Publish Subscribe (PPS) objects in a directory monitored by the mm-detect service. (See my earlier posts on PPS to learn how this messaging model enables loosely coupled, easy-to-extend designs; a sketch of monitoring a PPS directory appears after these steps.)
- To start synchronizing the device’s metadata, mm-detect loads the device’s QDB database into memory and passes the device’s mountpoint and database name to mm-sync.
- mm-sync synchronizes the metadata of all media files on the device.
- mm-sync uses media libraries to read file paths and other information from media tracks found on the device. It then copies the extracted metadata into the appropriate database tables and columns. Applications can then query the QDB database to obtain metadata information such as track title and album name.
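Step 3 is worth a closer look, because PPS makes device monitoring almost trivial: PPS objects are plain files, so a service can watch a whole directory with ordinary POSIX calls. The sketch below shows that pattern; the /pps/qnx/mount/ path is an assumption for illustration, since the directory mm-detect actually watches depends on how the platform is configured.

```c
/*
 * Minimal sketch of watching a PPS directory with plain POSIX calls
 * (PPS objects are exposed as files). The /pps/qnx/mount/ path is an
 * assumption for illustration. Opening the special ".all" object with
 * the "?wait,delta" options makes each read block until some object
 * in the directory changes, then return just the changed attributes.
 */
#include <stdio.h>
#include <fcntl.h>
#include <unistd.h>

int main(void)
{
    int fd = open("/pps/qnx/mount/.all?wait,delta", O_RDONLY);
    if (fd < 0) {
        perror("open");
        return 1;
    }

    char buf[4096];
    ssize_t n;

    /* Each read delivers publisher updates, e.g. when a driver
     * announces a newly inserted USB stick or SD card. */
    while ((n = read(fd, buf, sizeof(buf) - 1)) > 0) {
        buf[n] = '\0';
        printf("PPS update:\n%s\n", buf);  /* attribute lines: name::value */
    }

    close(fd);
    return 0;
}
```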
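And once mm-sync has populated the database, a player app retrieves metadata with ordinary SQL through the QDB client library. In the sketch below, the function names follow the QDB client API, but the database path and the audio_metadata/title schema names are assumptions; the real schema is defined by the mm-sync configuration on your platform.

```c
/*
 * Sketch of reading synchronized metadata back out of a QDB database.
 * The function names follow the QDB client API (see <qdb/qdb.h>), but
 * the database path and the audio_metadata/title schema names are
 * assumptions; the real schema comes from mm-sync's configuration.
 */
#include <stdio.h>
#include <qdb/qdb.h>

int main(void)
{
    /* Each synchronized device typically gets its own database. */
    qdb_hdl_t *db = qdb_connect("/dev/qdb/usb0", 0);
    if (db == NULL) {
        perror("qdb_connect");
        return 1;
    }

    /* Run a query; qdb_statement() returns -1 on error. */
    if (qdb_statement(db, "SELECT title FROM audio_metadata LIMIT 5;") != -1) {
        qdb_result_t *res = qdb_getresult(db);
        if (res != NULL) {
            int rows = qdb_rows(res);
            for (int i = 0; i < rows; i++)
                printf("title: %s\n", (char *)qdb_cell(res, i, 0));
            qdb_freeresult(res);
        }
    }

    qdb_disconnect(db);
    return 0;
}
```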
These steps describe how the architecture detects and synchronizes with devices, but they can't convey the efficiency of the design or how it delivers a fast, responsive user experience. For that, I invite you to check out this video on the QNX CAR Platform. The section on multimedia synchronization starts at the 1:32 mark, but I encourage you to watch the whole thing to see how the platform performs multimedia operations while concurrently managing other tasks.
Media browsing and playback
I’ve touched on how the multimedia architecture automatically detects and synchronizes devices. But of course, it does a lot more, including media browsing and media playback. To learn more about these features, visit the QNX CAR Platform documentation on the QNX website.
Previous posts in the QNX CAR Platform series:
- A question of getting there — wherein I examine how the platform gives customers the flexibility to choose from a variety of navigation solutions
- A question of architecture — wherein I discuss how the platform simplifies the challenge of integrating multiple disparate technologies, from graphics to silicon
- A question of concurrency — wherein I address the a priori question: why does the auto industry need a platform like QNX CAR in the first place?