
Monday, June 29, 2015

QNX-based nav system helps Ford SUVs stay on course down under

Paul Leroux
This just in: SWSA, a leading electronics supplier to the Australian automotive industry, and NNG, the developer of the award-winning iGO navigation software, have created a QNX-based navigation system for Ford Australia. The new system has been deployed in Ford Territory SUVs since June of this year.

To reduce driver distraction, the system offers a simplified user interface and feature set. And, to provide accurate route guidance, the system uses data from an internal gyroscope and an external traffic message channel, as well as standard GPS signals. Taking the conditions of local roads into account, the software provides a variety of alerts and speed-camera warnings; it also offers route guidance in Australian English.

The navigation system is based on the iGO My way Engine, which runs in millions of navigation devices worldwide. To read NNG's press release, click here.


SWSA's new nav system for the Ford Territory is based on the Freescale
i.MX31L processor, QNX Neutrino RTOS, and iGO My way Engine.

 

Saturday, June 27, 2015

HTML5 and the software engineer

HTML5 appears to have a number of benefits for consumers and car manufacturers. But what is often good for the goose is not necessarily good for the developer. Talking to the guys in the trenches is critical to understanding the true viability of HTML5.

Andy Gryc and Sheridan Ethier, manager of the automotive development team at QNX, pair up for a technical discussion on HTML5. They explore whether this new technology can support rich user interfaces, how HTML5 apps can be blended with apps written in OpenGL, and if interprocess communication can be implemented between native and web-based applications.

So without further ado, here’s the latest in the educational series of HTML5 videos from QNX.



This interview of Sheridan Ethier is the third in a series from QNX on HTML5.

QNX, AutoNavi collaborate to provide in-car navigation for automakers in China

Map database offers 20 million points of interest

Paul Leroux
This just in: QNX has announced that it is partnering with AutoNavi, a leading provider of digital map content and navigation solutions in China, to integrate AutoNavi’s technology into the QNX CAR platform.

AutoNavi offers a digital map database that covers approximately 3.6 million kilometers of roadway and over 20 million points of interest across China. By supporting this database, the QNX CAR platform will enable automotive companies to create navigation systems optimized for the Chinese market and users.

Said Yongqi Yang, executive vice president of automotive business, AutoNavi, “As a leading global provider of vehicle infotainment software platforms, QNX is not only a technology leader, but also a design concept innovator in enhancing vehicle flexibility — infotainment designs based on the QNX CAR Platform can be quickly customized.”

For more information on this partnership, read the press release. And to learn more about AutoNavi, visit their website.

Monday, June 22, 2015

Telematics China — closing out the year with a get-together in Shanghai

Guest post by Peter McCarthy of the QNX global partnerships team

Peter McCarthy
Is it November already? Time flies when you’re busy. And on the subject of flying, I’ll soon be on a plane to Shanghai, where our friends at Telematics China are hosting what promises to be a great automotive event from December 4 to 6. The organizers have been instrumental in bringing companies in the industry together, and they have been a great support to QNX at our own automotive events.

Back in August, QNX held an automotive summit in Shanghai. The success of this event owed a lot to partners like AutoNavi, a leader in the Chinese navigation market that is bringing its digital map content and navigation software to the QNX CAR Platform. The AutoNavi folks delivered a great presentation on the future of in-vehicle services and navigation, and I am sure we will continue these discussions when we meet at the Telematics China event.

When I scroll through the list of sponsors, exhibitors, and presenters at Telematics China, I know for sure my days and nights will be busy — but more importantly, filled with conversations with all the right people. So if you’re attending the event, please reach out to your QNX contacts and make time to meet. We look forward to seeing you there.



About Peter
When he isn't talking on oversized mobile phones, Peter McCarthy serves as director of global partnerships at QNX Software Systems, where he is responsible for establishing and fostering partnerships with technology and services companies in all of the company's target industries.

Sunday, June 21, 2015

ZENRIN DataCom integrates mobile navigation app with QNX CAR Platform

Don't know if you've noticed, but a variety of navigation software vendors have been integrating their solutions with the QNX CAR Platform for Infotainment. In the last few months alone, QNX has announced partnerships with Nokia HERE, Kotei Informatics, and AISIN AW — this in addition to its longstanding partnerships with navigation leaders like Elektrobit, TCS, and TeleNav.

The new partnerships are a boon to automakers and Tier 1 suppliers, especially those that target multiple geographies. More than ever, these companies can choose the navigation solution, or solutions, best suited to a given country or region.

The good news continues with ZENRIN DataCom, a leading provider of mapping services and products from Japan. ZENRIN is now integrating its Its-mo NAVI [Drive] 2015 application — which offers fuel prices, nearby parking spots, and other location-based features — with the QNX CAR Platform. In fact, ZENRIN and QNX demonstrated this integration last week at the Smartphone Japan conference in Tokyo.

The choice of venue may seem surprising, but it makes sense: Its-mo NAVI [Drive] is a smartphone app that, thanks to the collaboration between ZENRIN and QNX, can now run on head units as well. More to the point, this integration illustrates the benefit of building support for mobile app environments into a car infotainment platform: automakers can tap into a much larger developer community.

A spokesperson from ZENRIN DataCom says it best: “The automotive market in Japan and the rest of Asia is a vibrant and compelling environment for app developers but market volume is significantly lower than that for smartphones. A cross-platform concept is key as it enables apps to run on both smartphones and vehicle head units with minimal changes. The QNX CAR Platform, with its rich support for mobile application environments, is a very attractive feature for app developers in the mobile world.”

If you’d like to learn more about ZENRIN and its navigation app, I invite you to read the press release and visit the ZENRIN website.

Friday, June 19, 2015

In-car displays you hear, rather than see

We still have a lot in common with our caveman ancestors. (Yes, I know, they didn't all live in caves. Some lived in forests, others in savannahs, and still others in jungles. But I'm trying to make a point, so bear with me!)

Take, for example, our sense of hearing. At one time, we used auditory cues to locate prey or, conversely, avoid becoming prey. If a cave bear growled, getting a fix on the location of the growl could mean the difference between life and death. At the very least, it helped you avoid running directly into the bear's mouth.

Kidding aside, the human auditory system has a serious ability to fix the location, direction, and trajectory of objects, be they cave bears or Buicks. And it's an ability that's been honed from time immemorial. So why not take advantage of it when creating user interfaces for cars?

Which brings us to spatial auditory displays. In a nutshell, these displays allow you to perceive sound as coming from various locations in a three-dimensional space. Deployed in a car, they can help you intuitively identify voices and sources of instructions, and help pinpoint the location and relative trajectory of danger. They can also improve reaction times to application prompts and potentially hazardous events.

I know, that's a lot to take in. So let's look at an example.

Locating the emergency vehicle, without really trying
Have you ever been cruising along when, suddenly, you hear an ambulance siren? I don't know about you, but I often spend time figuring out where, exactly, the ambulance is coming from. And I don't always get it right. That's called a location error.

Such errors can occur for a variety of reasons. For example, if the ambulance is approaching from the right, but your left window is open and a building on the left is reflecting sound from the siren, you might make the mistake of thinking that the ambulance is approaching from the left. Your mind realizes, quite correctly, that the sound is coming from the left, but the environment is conspiring to mask where the sound is actually coming from.

A spatial auditory display can help address this problem by controlling the acoustic cues you hear. The degree to which the display can do this depends, in part, on the hardware employed. For example, a display based on a large array of loudspeakers can provide more location information than one based on two loudspeakers.

In any case (and this is important), the display can help you determine the location more quickly and with less cognitive load — which means you may have more brain cycles to respond to the situation appropriately.


Helping the driver locate and track an emergency vehicle

A slight right, not a sharp right
I'm only scratching the surface here. Spatial auditory displays can, in fact, help improve all kinds of driving activities, from engaging in a handsfree call to using your navigation system.

For example, rather than simply say "turn right", the display could emit the instruction from the right side of the vehicle. It could even use apparent motion of the auditory prompt to convey a slight right as opposed to a sharp right.
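To see the idea in miniature, consider the simplest possible spatial cue: constant-power stereo panning. A production spatial auditory display would use speaker arrays or head-related transfer functions, but this simplified sketch (the function and its parameters are our own invention, not anything from QNX) shows how a prompt's apparent direction can map to channel gains:

```python
import math

def pan_gains(azimuth_deg):
    """Constant-power stereo gains for a prompt at azimuth_deg:
    0 = straight ahead, +90 = hard right, -90 = hard left."""
    # Map the azimuth onto a pan angle between 0 (left) and pi/2 (right)
    theta = (azimuth_deg + 90) / 180 * (math.pi / 2)
    return math.cos(theta), math.sin(theta)  # (left gain, right gain)

# A "slight right" prompt sits mostly, but not entirely, in the right channel
left, right = pan_gains(30)
print(round(left, 3), round(right, 3))  # → 0.5 0.866
```

Sweeping the azimuth across successive prompts would produce the apparent motion described above, conveying a gentle bend rather than a sharp turn.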

But enough from me. To learn more about spatial auditory displays, check out a new article from my colleague Scott Pennock, whose knowledge of spatial auditory displays far surpasses mine. The article is called Spatial auditory displays: Reducing cognitive load and improving driver reaction times, and it has just been published by Embedded Computing Design magazine.

A question of getting there

The third of a series of posts on the QNX CAR Platform. In this installment, we turn to a key point of interest: the platform’s navigation service.

From the beginning, we designed the QNX CAR Platform for Infotainment with flexibility in mind. Our philosophy is to give customers the freedom to choose the hardware platforms, application environments, user-interface tools, and smartphone connectivity protocols that best address their requirements. This same spirit of flexibility extends to navigation solutions.

For evidence, look no further than our current technology concept car. It can support navigation from Elektrobit:



from Nokia HERE:



and from Kotei Informatics:



These are but a few examples. The QNX CAR Platform can also support navigation solutions from companies like AISIN AW, NavNGo, TCS, TeleNav, and ZENRIN DataCom, enabling automakers and automotive Tier 1 suppliers to choose the navigation solution, or solutions, best suited to the regions or demographics they wish to target. (In addition to these embedded solutions, the platform can also provide access to smartphone-based navigation services through its support for MirrorLink and other connectivity protocols — more on this in a subsequent post.)

Under the hood
In our previous installment, we looked at the QNX CAR Platform’s middleware layer, which provides infotainment applications with a variety of services, including Bluetooth, radio, multimedia discovery and playback, and automatic speech recognition. The middleware layer also includes a navigation service that, true to the platform’s overall flexibility, allows developers to use navigation engines from multiple vendors and to change engines without affecting the high-level navigation applications that the user interacts with.

An illustration is in order. If you look at the image below, you’ll see OpenGL-based map data rendered on one graphics layer and, on the layer above it, Qt-based application data (current street, distance to destination, and other route information) pulled from the navigation engine. By taking advantage of the platform’s navigation service, you could swap in a different navigation engine without having to rewrite the Qt application:



To achieve this flexibility, the navigation service makes use of the QNX CAR Platform’s persistent publish/subscribe (PPS) messaging, which cleanly abstracts lower-level services from the higher-level applications they communicate with. Let's look at another diagram to see how this works:



In the PPS model, services publish information to data objects; other programs can subscribe to those objects and receive notifications when the objects have changed. So, for the example above, the navigation engine could generate updates to the route information, and the navigation service could publish those updates to a PPS “navigation status object,” thereby making the updates available to any program that subscribes to the object — including the Qt application.

With this approach, the Qt application doesn't need to know anything about the navigation engine, nor does the navigation engine need to know anything about the Qt app. As a result, either could be swapped out without affecting the other.

Here's another example of how this model allows components to communicate with one another:
  1. Using the system's human machine interface (HMI), the driver asks the navigation system to search for a point of interest (POI) — this could take the form of a voice command or a tap on the system display.
  2. The HMI responds by writing the request to a PPS “navigation control” object.
  3. The navigation service reads the request from the PPS object and forwards it to the navigation engine.
  4. The navigation engine returns the result.
  5. The navigation service updates the PPS object to notify the HMI that its request has been completed. It also writes the results to a database so that all subscribers to this object can read the results.
By using PPS, the navigation service can make details of the route available to a variety of applications. For instance, it could publish trip information that a weather app could subscribe to. The app could then display the weather forecast for the destination, at the estimated time of arrival.
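The five-step exchange above can be mimicked with a toy publish/subscribe object in ordinary Python. Everything here, from the PPSObject class to the poi_search message and its fields, is invented to illustrate the flow rather than the platform's real API:

```python
# In-memory stand-in for a PPS object: writers update attributes,
# and every subscriber is notified of the change.
class PPSObject:
    def __init__(self):
        self.attrs = {}
        self.subscribers = []

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def write(self, **updates):
        self.attrs.update(updates)
        for cb in self.subscribers:
            cb(dict(self.attrs))

nav_control = PPSObject()
results = []

# Steps 3-5: the "navigation service" reads each request, queries a
# stand-in engine, and publishes the completed result.
def navigation_service(attrs):
    if attrs.get("msg") == "poi_search":
        hits = [p for p in ["Airport Cafe", "Hardware Store"]
                if attrs["term"].lower() in p.lower()]
        results.append({"id": attrs["id"], "hits": hits})

nav_control.subscribe(navigation_service)

# Steps 1-2: the HMI writes a POI search request to the control object
nav_control.write(msg="poi_search", term="cafe", id=1)
print(results)  # → [{'id': 1, 'hits': ['Airport Cafe']}]
```

Because the HMI and the service only ever touch the shared object, either side can be replaced without the other noticing, which is exactly the decoupling PPS provides.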

To give developers a jump start, the QNX CAR Platform comes pre-integrated with Elektrobit’s EB street director navigation software. This reference integration shows developers how to implement "command and control" between the HMI and the participating components, including the navigation engine, navigation service, window manager, and PPS interface. As the above diagram indicates, the reference implementation works with both of the HMIs — one based on HTML5, the other based on Qt — that the QNX CAR Platform supports out of the box.


Previous posts in the QNX CAR Platform series:

Thursday, June 18, 2015

The ultimate show-me car

The fifth installment in the CES Cars of Fame series. Our inductee for this week: a most bodacious Bentley.

It's one thing to say you can do something. It's another thing to prove it. Which helps explain why we create technology concept cars.

You see, we like to tell people that flexibility and customization form the very DNA of the QNX CAR Platform for Infotainment. Which they do. But in the automotive world, people don't just say "tell me"; they say "show me". And so, we used the platform to transform a Bentley Continental GT into a unique concept car, equipped with features never before seen in a vehicle.

Now here's the thing. This is the same QNX CAR Platform found in the QNX reference vehicle, which I discussed last week. But when you compare the infotainment systems in the two vehicles, the differences are dramatic: different features, different branding, different look-and-feel.

The explanation is simple: The reference vehicle shows what the QNX CAR Platform can do out of the box, while the Bentley demonstrates what the platform lets you do once you add your imagination to the mix. One platform, many possibilities.

Enough talk; time to look at the car. And let's start with the exterior, because wow:



The awesome (and full HD) center stack
And now let's move to the interior, where the first thing you see is a gorgeous center stack. This immense touchscreen features a gracefully curved surface, full HD graphics, and TI’s optical touch input technology, which allows a physical control knob to be mounted directly on the screen — a feature that’s cool and useful. The center stack supports a variety of applications, including a 3D navigation system from Elektrobit that makes full use of the display:



At 17 inches, the display is big enough to display other functions, such as the car’s media player or virtual mechanic, and still have plenty of room for navigation:



The awesome (and very configurable) digital instrument cluster
The instrument cluster is implemented entirely in software, though you would hardly know it — the virtual gauges are impressively realistic. More impressive still is the cluster’s ability to morph itself on the fly. Put the car in Drive, and the cluster will display a tach, gas gauge, temperature gauge, and turn-by-turn directions — the cluster pulls these directions from the center stack’s navigation system. Put the car in Reverse, and the cluster will display a video feed from the car’s backup camera. You can also have the cluster display the current weather and current sound track:



The awesome (and just plain fun) web app
The web app works with any web browser and allows the driver to view data that the car publishes to the cloud, such as fluid levels, tire pressure, brake wear, and the current track being played by the infotainment system. It even allows the driver to remotely start or stop the engine, open or close windows, and so on:



The awesome (and nicely integrated) smartphone support
The Bentley also showcases how the QNX CAR Platform enables advanced integration with popular smartphones. For instance, the car can communicate with a smartphone to stream music, or to provide notifications of incoming email, news feeds, and other real-time information — all displayed in a manner appropriate to the automotive context. Here's an example:



The awesome everything else
I’ve only scratched the surface of what the car can do. For instance, it also provides:

  • Advanced voice rec — Just say “Hello Bentley,” and the car’s voice recognition system immediately comes to life and begins to interact with you — in a British accent, of course.
     
  • Advanced multimedia system — Includes support for Internet radio.
     
  • Video conferencing with realistic telepresence — Separate cameras for the driver and passenger provide independent video streams, while fullband voice technology from QNX offers expanded bandwidth for greater telepresence.
     
  • LTE connectivity — The car features an LTE radio modem, as well as a Wi-Fi hotspot for devices you bring into the car.

Moving pictures
Okay, time for some video. Here's a fun look at the making of the car:



And here's a run-through of the car's many capabilities, filmed by our friends at TI during 2015 CES:





Tuesday, June 16, 2015

QNX, NVIDIA team up to deliver infotainment solutions

Today, at SAE Convergence, QNX announced that it is working with graphics leader NVIDIA to bring infotainment solutions to the automotive market. As part of this initiative, the companies will integrate support for the NVIDIA Tegra processor into the QNX CAR 2 application platform.

The Tegra system-on-chip is the size of a thumbnail, yet it incorporates a quad-core ARM CPU and a GeForce GPU, as well as dedicated audio, video, and image processors.

The NVIDIA Tegra visual
computing module
“QNX Software Systems and NVIDIA have a proven track record of delivering on production programs for Audi... and we’re excited to add support for Tegra to the latest generation of our automotive platform,” said Linda Campbell, QNX director of strategic alliances.

Speaking of Audi, NVIDIA is bringing an Audi A6 to SAE Convergence, equipped with an infotainment system powered by technology from QNX and NVIDIA. The system bristles with high-end features, including 3D navigation with Google Maps and Google Earth, as well as natural voice recognition.

For more information on this announcement, read the press release, and for more information on QNX activities at SAE Convergence, visit our Convergence overview page.


Monday, June 15, 2015

Making the smartphone connection: The state of automotive navigation in Japan

A guest post from Yoshiki Chubachi, the automotive business development manager for QNX Software Systems in Japan

Yoshiki Chubachi
The market for navigation systems in Japan grew rapidly until 2006, but since 2007 the yearly volume has reached the saturation point, at about 2.9M units. For instance, in 2008, consumers purchased 900k after-market systems, 1.1M dealer-installed systems, and 909k factory-installed systems. In 2010, those numbers had changed slightly: 1.01M after-market systems, 1.03M dealer-installed systems, and 858k factory-installed systems (source: Yano Research Institute).

That said, the market is starting to experience a shift from after-market to factory-installed devices. Automakers and their tier one suppliers are struggling to differentiate their products by implementing value-added features.

To get a feel for current navigation trends in Japan, let’s look at some notable after-market products that shipped in 2015. As you'll see, smartphones are exerting a major influence on this market, both in terms of system design and user features:

Pioneer AVIC-VH09CS — This high-end system combines augmented reality technology with a front-view camera, overlaying your route on a live video of the road. It also implements a collision warning system by measuring the distance of the car ahead. Other features include terrestrial digital TV (full HD and 1seg), DVD video, AM-FM, CD and SD music, iPod connectivity, and music ripping and encoding.

Clarion NX501 — The smartphone world seems to drive navigation trends, and the Clarion NX501 is no exception. It offers a touchscreen UI that supports swipes, flicks, and other finger gestures similar to those found in smartphones and tablets. Suzuki factory-installed systems also use this type of user interface.

Fujitsu-Ten AVN-F01i — This system comes with three bundled iPhone applications: Twitter Drive (combines tweets with location data), Where is My Car (uses augmented reality to show your parking location on the phone screen; great for finding your car in large parking lots); and News Reader (allows the system’s text-to-speech engine to read out news articles). The system connects to the phone through Bluetooth.

Panasonic CN-H500WD — This system also lets you use finger swipes to operate navigation and audio functions, including a scrolling map. It comes with a smartphone application that provides POI search; the search results are downloaded to the navigation system through Bluetooth.

Mitsubishi NR-MZ50 — This system provides an “OpenInfo” service based on Pioneer’s Smartloop system, which provides traffic data from a Pioneer server. VICS (Vehicle Information and Communication System) is a popular traffic data service in Japan that is similar to the RDS-TMC standard, but its coverage is limited to main highways. The smartphone receives traffic data, derived from anonymous traffic probe information, wherever the VICS service isn't supported. Information from the phone is transmitted to the navigation system through Bluetooth.

Connectivity between navigation systems and smartphones remains an issue in Japan. Conventional cell phones are equipped with the Bluetooth DUN profile, which enables data communication between the nav system and the phone, but unfortunately, some carriers still don’t support this profile. Until they do, lack of connectivity will remain a roadblock.

Nonetheless, using smartphones to deliver applications and the user experience has become a major trend in Japan’s navigation systems. Some automotive tier one suppliers, such as Pioneer, already provide navigation applications on the phone. The QNX CAR 2 application platform, with its mobile connectivity features and auto-centric HTML5 framework, offers an ideal foundation for enabling this approach.

Saturday, June 13, 2015

What’s HTML5 got to do with automotive?

There’s been a lot of noise lately about HTML5. A September 2015 report by binvisions shows that search engines and social media web sites are leading the way toward adoption: Google, Facebook, YouTube, Wikipedia, Twitter, and plenty more have already transitioned to HTML5. Some are taking it even further: Facebook has an HTML5 Resource Center for developers and the Financial Times has a mobile HTML5 version of their website.

It won’t be long before HTML5 is ubiquitous. We think automakers should (and will) use it. 

To elucidate the technology and its relevance, we’ve created a series of educational videos on the topic. Here is the first in that series. Interviews with partners, customers, and industry gurus will soon follow. 



This simple overview is the first in a series from QNX on HTML5. (Personally I like the ending the best.)

Friday, June 12, 2015

Crowd-sourced maps: the future of in-car navigation?

Guest post by Daniel Gast, innovation manager, Elektrobit Automotive

Crowdsourcing has become a major trend. Even McDonald’s has been getting into the act, asking consumers to submit new ideas for burgers. In 2015 the company’s “My Burger 3.0” campaign elicited an enormous response in Germany, with more than 200,000 burger ideas and more than 150,000 people voting for their favorites.

From burgers we go to a key component of navigation systems: digital maps. OpenStreetMap (OSM), a well-known and globally crowdsourced project, is dedicated to creating free worldwide maps and has attracted more than 100,000 registered contributors. These people volunteer their services, creating digital maps without being paid; take a glimpse of their work at www.openstreetmap.org.

Why is the amount of data behind OSM constantly growing?
Creating OSM maps is a kind of charity work, open to all to contribute and to use under free licenses. The technology behind it is very user friendly, which helps ensure long-term loyalty among contributors. But probably the most important factor is the fun it brings. Contributing to the project consists of recording streets, buildings, bridges, forests, points of interest, and other items that you would benefit from having in a map. For many OSM editors, this is their favorite hobby — they are “addicts” in the best sense of the word. They love the project and aspire to create a perfect map. That’s why the growing body of available map data is also of very good quality.

Can automakers and drivers benefit from crowd-sourced map data like OpenStreetMap?
Yes, they can. Because so many people contribute to the project, the amount of data is growing continuously. Every contributor can add or edit content at any time, and changes are integrated into the public OSM database immediately.

In the beginning only streets were collected, but because the data format is extensible, editors can add data like parking spots or pedestrian walkways. For instance, a group of firemen added hydrants for their region to the map material, using OSM’s flexibility to define and add new content. Automakers could take advantage of this flexibility to integrate individual points of interest like car repair shops or to drive business models with third-party partners, such as couponing activities.

Because it’s free of charge, OSM data could, in the mid to long term, develop into a competitive and low-priced alternative to databases being provided by commercial map data suppliers.

For their part, automakers could easily provide toolkits that allow drivers to edit wrong or missing map data on the go. Or even better, allow them to personalize maps with individual content like preferred parking places or favorite burger restaurants.

Are automotive infotainment systems ready for these new kinds of map data?
From a technical point of view, automotive software like the QNX CAR Platform for Infotainment or EB street director navigation can, without modifications, interpret this new kind of data, since the OSM map data can be converted to a specific format, much like commercial map data. It’s like creating your individual burger: the bread and meat remain the same, but you opt for tomatoes instead of onions.

That said, some gaps in the OSM data must be filled before it can provide full-blown automotive navigation. Features like traffic signs, lane information, and turn restrictions are available, but coverage remains limited. Also, the regional coverage varies widely — coverage in Germany, for example, is much higher than in countries in Africa or South America.

From the automaker’s perspective, it could be an interesting challenge to encourage the community to contribute this type of content. One way to support this idea is to develop an OSM-based navigation system for mobile use. After reaching maturity, the system could easily be merged into the vehicle and would allow drivers to use premium directions from automotive-approved infotainment systems like EB street director — which we saw at CES in the QNX CAR Platform — for less money.



Daniel Gast has worked for Elektrobit since 2000, initially as a software engineer and later as product manager for EB street director navigation. He subsequently took over responsibility for the navigation solutions business area and now coordinates innovation management for Elektrobit Automotive. Daniel studied computer science in Erlangen.

Keep up to date with Elektrobit's latest automotive news and products by signing up for the EB Automotive Newsletter — Ed.

Thursday, June 11, 2015

Long time, no see: Catching up with the QNX CAR Platform

By Megan Alink, Director of Marketing Communications for Automotive

It’s a fact — a person simply can’t be in two places at one time. I can’t, you can’t, and the demo team at QNX can’t (especially when they’re brainstorming exciting showcase projects for 2016… but that’s another blog. Note to self.) So what’s a QNX-loving, software-admiring, car aficionado to do when he or she has lost touch and wants to see the latest on the QNX CAR Platform for Infotainment? Video, my friends.

One of the latest additions to our QNX Cam YouTube channel is an update to a video made just over two and a half years ago, in which my colleague, Sheridan Ethier, took viewers on a feature-by-feature walkthrough of the QNX CAR Platform. Now, Sheridan’s back for another tour, so sit back and enjoy a good, old-fashioned catch-up with what’s been going on with our flagship automotive product (with time references, just in case you’re in a bit of a hurry).

Sheridan Ethier hits the road in the QNX reference vehicle based on a modified Jeep Wrangler, running the latest QNX CAR Platform for Infotainment.

We kick things off with a look at one of the most popular elements of an infotainment system — multimedia. Starting around the 01:30 mark, Sheridan shows how the QNX CAR Platform supports a variety of music formats and media sources, from the system’s own multimedia player to a brought-in device. And when your passenger is agitating to switch from the CCR playlist on your MP3 device to Meghan Trainor on her USB music collection, the platform’s fast detection and sync time means you’ll barely miss a head-bob.

The QNX CAR Platform’s native multimedia player — the “juke box” — is just one of many options for enjoying your music.

About five minutes in, we take a look at how the QNX CAR Platform implements voice recognition. Whether you’re seeking out a hot latté, navigating to the nearest airport, or calling a co-worker to say you’ll be a few minutes late, the QNX CAR Platform lets you do what you want to do while doing what you need to do — keeping your hands on the wheel and your eyes on the road. Don’t miss a look at concurrency (previously discussed here by Paul Leroux) during this segment, when Sheridan runs the results of his voice commands (multimedia, navigation, and a hands-free call) smoothly at the same time.

Using voice recognition, users can navigate to a destination by address or point of interest description (such as an airport).

At eight minutes, Sheridan tells us about one of the best examples of the flexibility of the QNX CAR Platform — its support for application environments, including native C/C++, Qt, HTML5, and APK for running Android applications. The platform’s audio management capability makes a cameo appearance when Sheridan switches between the native multimedia player and the Pandora HTML5 app.

Pandora is just one of the HTML5 applications supported by the QNX CAR Platform.

As Sheridan tells us (at approximately 12:00), the ability to project smartphone screens and applications into the vehicle is an important trend in automotive. With technologies like MirrorLink, users can access nearly all of the applications available on their smartphone right from the head unit.

Projection technologies like MirrorLink allow automakers to select which applications will be delivered to the vehicle’s head unit from the user’s connected smartphone. 

Finally, we take a look at two interesting features that differentiate the QNX CAR Platform — last mode persistence (e.g. the song playing when you turned the car off resumes at the same point when you turn the car back on) and fastboot (which, in the case of QNX CAR, can bring your backup camera to life in 0.8 seconds, well under the NHTSA-mandated 2 seconds). These features work hand-in-hand to ensure a safer, more enjoyable, more responsive driving experience.

Fastboot in 0.8 seconds means that when you’re ready to reverse, your car is ready to show you the way.

Interested in learning more about the QNX CAR Platform for Infotainment? Check out Paul Leroux’s blog on the architecture of this sophisticated piece of software. To see QNX CAR in action, read Tina Jeffrey’s blog, in which she talks about how the platform was implemented in the reimagined QNX reference vehicle for CES 2015.

Check out the video here:


Wednesday, June 10, 2015

CES press: This just in!

OK well, maybe not 'just in' but these articles and releases were posted yesterday and are quite exciting if you are following QNX technology:

TeleCommunication Systems Supplies Advanced Navigation for QNX CAR 2 Application Platform

TI Takes The Driver’s Seat With An Unrivaled, Full System Solution For Connected Automotive Infotainment



Why I should have gone to CES this year

No problem, I said, I'll be happy to stay back at the office. After all, somebody has to hold down the fort while everyone is at CES, and it may as well be me.

Of course, I didn't know what Audi was bringing to the show. If I had, I wouldn't have been so willing to take one for the team. If you're wondering what I am talking about, it's the new user-programmable instrument cluster for the upcoming 2015 Audi TT. It's based on the QNX CAR Platform for Infotainment, and it's about the coolest thing I've seen in a car, ever — even if I haven't yet had a chance to see it in person.

Roll the tape...





Why doesn’t my navigation system understand me?

A story where big is good, but small is even better.

Yoshiki Chubachi
My wife and I are about to go shopping in a nearby town. So I get into my car, turn the key, and set the destination from POIs on the navigation system. The route calculation starts and gives me today’s route. But somehow, I feel a sense of doubt every time this route comes up on the system...

Route calculation in navigation systems uses Dijkstra's algorithm, devised by Edsger Dijkstra in 1956 to determine the shortest path in a graph. To save calculation time, navigation systems run a bidirectional search: one search proceeds forward from the starting point while the other proceeds backward from the destination. The data scheme that navigation systems use to represent maps consists of nodes, links, and attributes. Typically, a node represents a street intersection; a link represents the stretch of road, or connection, between two nodes; and attributes consist of properties such as street name, street addresses, and speed limit (see diagram).

Features of a map database. From Wikipedia.
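The node/link scheme and shortest-path search described above can be sketched in a few lines. This is a minimal single-direction Dijkstra over a toy map, not production navigation code; the node names and costs are invented for illustration:

```python
import heapq

def dijkstra(links, start, goal):
    """Shortest path on a node/link graph.

    links: dict mapping a node to a list of (neighbor, cost) pairs,
    where cost would be derived from link attributes such as
    length and speed limit.
    """
    # Priority queue of (cost so far, node, path taken)
    queue = [(0, start, [start])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, weight in links.get(node, []):
            if neighbor not in visited:
                heapq.heappush(queue, (cost + weight, neighbor, path + [neighbor]))
    return None  # no route exists

# Toy map: nodes are intersections, weights are travel costs.
city = {
    "A": [("B", 2), ("C", 5)],
    "B": [("C", 1), ("D", 4)],
    "C": [("D", 1)],
    "D": [],
}
print(dijkstra(city, "A", "D"))  # -> (4, ['A', 'B', 'C', 'D'])
```

A real system would run two such searches toward each other and stop when their frontiers meet, which roughly halves the explored area.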
As you may guess, it can take a long time to calculate the shortest path among all available routes. The problem is that automakers typically impose stringent timing requirements. For example, I know of an automaker that expected the route from Hokkaido (in northern Japan) to Kyushu (in southern Japan) to be calculated in just a few seconds.

To address this issue, a system can use a variety of approaches. For instance, it can store map data hierarchically, where the highest class consists of major highways. To choose a route between two points, the system follows the hierarchical order, from high to low. Another approach is to use precalculated data, prepared by the navigation supplier. These examples offer only a glimpse of the complexity and magnitude of the problems faced by navigation system vendors.
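The hierarchical idea can be illustrated with a deliberately simple two-level sketch (the graphs, ramp mapping, and node names are all invented for this example; real systems use many layers and precomputed shortcuts): search local roads only near the endpoints, and the sparse highway layer in between, instead of the whole map at once.

```python
import heapq

def shortest(graph, start, goal):
    """Plain Dijkstra; graph maps node -> [(neighbor, cost), ...]."""
    queue, seen = [(0, start, [start])], set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, w in graph.get(node, []):
            heapq.heappush(queue, (cost + w, nxt, path + [nxt]))
    return None

def route_hierarchical(local, highway, start, goal, ramp):
    """Three legs: local roads to the nearest on-ramp, the highway
    layer between ramps, then local roads to the goal.
    ramp maps a local node to its closest highway node."""
    c1, p1 = shortest(local, start, ramp[start])
    c2, p2 = shortest(highway, ramp[start], ramp[goal])
    c3, p3 = shortest(local, ramp[goal], goal)
    # Join the three legs, dropping the duplicated ramp nodes.
    return c1 + c2 + c3, p1 + p2[1:] + p3[1:]

local = {"home": [("H1", 2)], "H2": [("office", 3)], "H1": [], "office": []}
highway = {"H1": [("H2", 10)], "H2": []}
ramp = {"home": "H1", "office": "H2"}
print(route_hierarchical(local, highway, "home", "office", ramp))
# -> (15, ['home', 'H1', 'H2', 'office'])
```

The payoff is that the expensive middle leg runs on a graph orders of magnitude smaller than the full road network, at the cost of occasionally missing a slightly shorter all-local route.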

An encouraging trend
Big data is the hot topic in the navigation world. One source of this data is mobile phones, which provide floating car data (current speed, current location, travel direction, etc.) that can be used by digital instrument clusters and other telematics components. A system that could benefit from such data is VICS (Vehicle Information and Communication System), a traffic-information standard used in Japan and supported by Japanese navigation systems. Currently, VICS broadcasts information updates only every 5 minutes because of the bandwidth limitations of the FM sub-band that it uses. As a result, a navigation system will sometimes indicate that no traffic jam exists, even though digital traffic signs indicate that a jam does indeed exist and that service is limited to the main road. This delay, and the inconvenience it causes, could be addressed with floating car data.


An example of a VICS-enabled system in which traffic congestion, alternate routes, and other information is overlaid on the navigation map. Source: VICS

During the great earthquake disaster in East Japan, Google and automotive OEMs (Honda, Nissan, Toyota) collaborated by using floating car data to provide road availability information — a clear demonstration of how big data can enhance car navigation. Leveraging big data to improve route calculation is an encouraging trend.

Small data: making it personal
Still, a lot can be accomplished with small data; specifically, personalization. I may prefer one route on the weekend, but another route on a rainy day, and yet another route on my wife's birthday. To some extent, a self-learning system could realize this personalization by gauging how frequently I've used a route in the past. But I don’t think that's enough. As of now, I feel that my navigation system doesn't understand me as well as Amazon, which at least seems to know which book I’d like to read! Navigation systems need to learn more about who I am, how well I can drive, and what I like.

Personalization sits at the opposite end of the scale from big data, yet it offers even more convenience to the driver. The more a navigation system learns about a driver (as in “Oh, this guy has limited driving skills and doesn’t like narrow roads”), the better. It is best to store this data on a server; that way, the driver could benefit even after switching to a different car or navigation system. This can be done using the latest web technologies and machine learning. Currently, navigation systems employ a rule-based algorithm, but it would be interesting to investigate probability-based approaches, such as Bayesian networks.
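To make the probability-based idea concrete, here is a toy sketch (the contexts, route labels, and class name are all invented for illustration; a real system would model many more features): count which route the driver actually picked under each context and predict the most likely choice, with add-one smoothing so unseen routes still get a nonzero probability.

```python
from collections import Counter, defaultdict

class RoutePreferences:
    """Toy probability-based route chooser: tallies which route the
    driver chose under each context (e.g. weekend vs. weekday,
    dry vs. rain) and predicts the most likely candidate."""

    def __init__(self):
        self.counts = defaultdict(Counter)  # context -> route tallies

    def observe(self, context, route):
        """Record that the driver took this route in this context."""
        self.counts[context][route] += 1

    def predict(self, context, candidates):
        """Return the candidate with the highest smoothed
        P(route | context)."""
        seen = self.counts[context]
        total = sum(seen.values()) + len(candidates)
        # Add-one (Laplace) smoothing over the candidate set
        return max(candidates, key=lambda r: (seen[r] + 1) / total)

prefs = RoutePreferences()
for _ in range(5):
    prefs.observe(("weekend", "dry"), "scenic")
prefs.observe(("weekend", "dry"), "highway")
print(prefs.predict(("weekend", "dry"), ["scenic", "highway"]))  # -> scenic
```

A full Bayesian network would additionally model dependencies between context variables (weather, day, time, destination) instead of treating each context as an opaque key.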

I’m looking forward to the day when my navigation system can provide a route that suits my personal tastes, skills, and habits. Navigation suppliers may be facing threats from the mobile world, including Google and Apple, but I believe that experienced navigation developers can win by returning to the original point of navigation: customer satisfaction.

Yoshiki Chubachi is the automotive business development manager for QNX Software Systems in Japan.

Monday, June 8, 2015

It's a Bentley! A guided tour of the new QNX technology concept car

"Bend it, shape it, any way you want it"
— Headline from a QNX advertisement, circa 1987

I’m about to show you some pictures of a car. Not just any car, but a powerful, luxurious, and stunningly beautiful car. A car that has undergone a technological transformation.

If you’re like me, you'll be fascinated by the car’s features, some of which have never been seen in a vehicle — until now. But if you can, remember that it isn’t just about the cool features. It’s also about the platform that enabled them.

I’m speaking, of course, of the QNX CAR application platform.

We created the new QNX technology concept car — a modified Bentley Continental GT — to demonstrate that flexibility and customization form the very DNA of the QNX CAR platform. If you’ve seen the QNX reference vehicle, you already know that the platform provides an extremely rich environment for in-car infotainment, complete with HMI frameworks, smartphone integration, an HTML5 engine, a mobile device gateway, and a host of pre-integrated partner technologies — everything to kickstart our customers' projects. But in the automotive world, differentiation is everything. So it’s just as important that the platform enables customers to add their own branding, features, and sizzle. And to do it quickly.

Ease of branding and personalization is just one of the capabilities of the QNX CAR platform.
Which is where the new concept car comes in. To create it, we used the same base QNX CAR platform that we offer our customers. But when you compare the Bentley to the Jeep, which uses a stock version of QNX CAR, the differences are dramatic: different features, different branding, and a different look-and-feel. In effect, the Jeep shows what QNX CAR can do out of the box, while the Bentley shows what QNX CAR lets you do once you start bending it to your imagination. One platform, many possibilities.

Which brings me to the slogan at the top of this post. It's amazing to think that a core value of QNX technology in the 1980s — giving customers the flexibility to achieve what they want to do — remains just as true today. Some values, it seems, are worth keeping.

And now, the car…
I know that you’re anxious to peek inside the car and see what we’ve done. But before we go any further, take a moment to savor the car’s beautifully sculpted exterior. This is one classy set of wheels. In fact, if you ask me, the wheels alone are worth the price of admission:



The awesome (and full HD) center stack
Okay, time to hop in — but get ready to prop up your jaw. Because the first thing you’ll notice is the jaw-droppingly beautiful center stack. This immense touchscreen features a gracefully curved surface, full HD graphics, and TI’s optical touch input technology, which allows a physical control knob to be mounted directly on the screen — a feature that’s cool and useful. (In the photo below, the clock display appears within the knob.)

The center stack supports a host of applications, including a 3D navigation system from Elektrobit that makes full use of the display. Just check out this bird’s-eye view of the Las Vegas Strip:



So how big is the display? Big enough to provide access to other functions, such as the car’s media player or virtual mechanic, and still have plenty of room for navigation. Check it out:



The awesome (and very polite) voice rec system
Time to talk to the car. Just say “Hello Bentley,” and the car’s voice recognition system immediately comes to life and begins to interact with you — in a British accent, no less. You can now tell the media player what you’d like to hear and the navigation system where you’d like to go.

To provide natural language speech recognition, the system uses the cloud-based AT&T Watson speech engine, as well as an “intent framework” from QNX. It also uses keyword spotting technology from Sensory so you can start the system simply by talking to it.

The awesome (and nicely integrated) smartphone support
The Bentley also showcases how the QNX CAR platform enables automakers to offer advanced integration with popular smartphones. For instance, the car can communicate with a smartphone to stream music, or to provide notifications of incoming email, news feeds, and other real-time information — all displayed in a manner appropriate to the automotive context. Here's an example:



The awesome (and just plain fun) web app
I know, I know: the car looks cool, but you’re not at CES this week to see it first-hand. But how about the next best thing? Just connect to the web app and keep tabs on the Bentley in real time. (Note: The car will go online later this morning.) The app lets you view a variety of data that the car publishes to the cloud, such as what song the infotainment system is playing and whether someone has just opened a door. It also displays information that would be extremely helpful if this were your personal car, such as fluid levels and tire pressure. (This is a preliminary screen for the app, so I'm not sure if the tire pressures are realistic.)



UPDATE: The web app is now live, and the desktop version features a live camera feed of the Bentley and Jeep. Check it out!



The awesome (and very configurable) digital instrument cluster
The instrument cluster is implemented entirely in software, though you would hardly know it — the virtual gauges are impressively realistic. But more impressive still is the cluster’s ability to morph itself on the fly. Put the car in Drive, and the cluster will display a tach, gas gauge, temperature gauge, and turn-by-turn directions — the cluster pulls these directions from the center stack’s navigation system (cool, that). Put the car in Reverse, and the cluster will display a video feed from the car’s backup camera.



There are other options as well. For instance, the cluster can display information from the media player or display the current weather:



The awesome everything else
I’ve only scratched the surface of what the car can do. For instance, it also provides:
  • Advanced multimedia system — Offers direct support for Pandora radio and the first embedded in-car implementation of the Shazam music discovery service.
     
  • Video conferencing with realistic telepresence — Separate cameras for the driver and passenger provide independent video streams, while high-definition voice technology from QNX offers expanded bandwidth for greater realism, as well as stereo telepresence for making the remote caller sound as if they’re sitting right next to you.
     
  • LTE connectivity — The car features an LTE radio modem, as well as a Wi-Fi hotspot for devices you bring into the car.

Super size those images
Want to see the center stack and instrument cluster in all their high-resolution glory? Just check out our QNX Flickr account.

That's all for now, but stay tuned: We’ll have plenty more news for you today and all through this week.