Tuesday, June 30, 2015

Speech interfaces: UI revolution or intelligent evolution?

Speech interfaces have received a lot of attention recently, especially with the marketing blitz for Siri, the new speech interface for the iPhone.

After watching some of the TV commercials you might conclude that you can simply talk to your phone as if it were your friend, and it will figure out what you want. For example, in one scenario the actor asks the phone, “Do I need a raincoat?”, and the phone responds with weather information.

A colleague commented that if he wanted weather information he would just ask for it. As in “What is the weather going to be like in Seattle?” or “Is it going to rain in Seattle?”.

Without more conversational context, if a friend were to ask me, “Do I need a raincoat?”, I would probably respond, “I don’t know, do you?” — jokingly, of course.

Evo or revo?
Are we ready to converse
with our phones and cars?
Kidding aside, systems like Siri raise an important question: Are we about to see a paradigm shift in user interfaces?

Possibly. But I think it will be more of a UI evolution than a UI revolution. In other words, speech interfaces will play a bigger role in UI designs, but that doesn't mean you're about to start talking to your phone — or any other device — as if it’s your best friend.

Currently, speech interfaces are underutilized. The reasons for this aren't yet clear, though they seem to encompass both technical and user issues. Traditionally, speech recognition accuracy rates have been less than perfect. Poor user interface design (for instance, reprompting strategies) has contributed to the overall problem and to increased user frustration.

Also, people simply aren't used to speech interfaces. For example, many phones support voice-dialing, yet most people don't use this feature. And user interface designers seem reluctant to leverage speech interfaces, possibly because of the additional cost and complexity, lack of awareness, or some other reason.


"Relying heavily on speech can lead
to a suboptimal user experience..."

As a further complication, relying heavily on speech as an interface can lead to a suboptimal user experience. Speech interfaces pose some real challenges, including recognition accuracy rates, natural language understanding, error recovery dialogs, UI design, and testing. They aren't the flawless wonders that some marketers would lead you to believe.

Still, I believe there is a happy medium for leveraging speech interfaces as part of a multi-modal interface — one that uses speech as an interface where it makes sense. Some tasks are better suited for a speech interface, while others are not. For example, speech provides an ideal way to provide input to an application when you can capitalize on information stored in the user’s head. But it’s much less successful when dealing with large lists of unfamiliar items.

Talkin' to your ride
Other factors, besides Apple, are driving the growing role of speech interfaces — particularly in automotive. Speech interfaces can, for example, help address the issue of driver distraction. They allow drivers to keep their “eyes on the road and hands on the wheel,” to quote an oft-used phrase.

So, will we see a paradigm shift towards speech interfaces? It's unlikely. I'm hoping, though, that we'll see a UI evolution that makes better use of them.

Think of it more as a paradigm nudge than a paradigm shift.


Recommended reading

Situation Awareness: a Holistic Approach to the Driver Distraction Problem
Wideband Speech Communications for Automotive: the Good, the Bad, and the Ugly

 

Pimp your ride with augmented reality — Part II

Last week, I introduced you to some cool examples of augmented reality, or AR, and stated that AR can help drivers deal with the burgeoning amount of information in the car.

Now that we’ve covered the basics, let’s look at some use-cases for both drivers and passengers. Remember, though, that these examples are just a taste — the possibilities for integrating AR into the car are virtually endless.



AR for the driver
When it comes to drivers, AR will focus on providing information while reducing distraction. Already, some vehicles use AR to overlay the vehicle trajectory onto a backup camera display, allowing the driver to gauge where the car is headed. Some luxury cars go one step further and overlay lane markings or hazards in the vehicle display.

Expect even more functionality in the future. In the case of a backup camera, the display might take advantage of 3D technology, allowing you to see, for example, that a skateboard is closer than the post you are backing towards. And then there is GM's prototype heads-up system, which, in dark or foggy conditions, can project lane edges onto the windshield or highlight people crossing the road up ahead:



AR can be extremely powerful while keeping distraction to a minimum. Take destination search, for example. You could issue the verbal command, “Take me to a Starbucks on my route. I want to see their cool AR cups”. The nav system could then overlay a subtle route guidance over the road with a small Starbucks logo that gets bigger as you approach your destination. The logo could then hover over the building when you arrive.

You'll no longer have to wonder if your destination is on the right or left, or if your nav system is correct when it says, “You have arrived at your destination.” The answer will be right in front of you.

AR for the passenger
So what about the passenger? Well, you could easily apply AR to side windows and allow passengers to learn more about the world around them, a la Wikitude. Take, for example, this recent video from Toyota, which represents one of the best examples of how AR could make long road trips less tedious and more enjoyable:


QNX acoustics technology shortlisted for 2015 embedded AWARD

Okay, first things first. I didn't get the capitalization wrong. The name of the award really is spelled that way. I thought it odd at first, but I'm getting used to it. And besides, who am I to complain? After all, I spend a good part of my life promoting a product whose name is spelled all uppercase, and... where was I? Oh yes, the award!

Every year, the folks who organize the embedded world Exhibition & Conference hold the embedded AWARDs, which honor the most innovative software, hardware, and tools for embedded developers. And this year, the competition judges selected QNX Acoustics for Active Noise Control as a finalist in the software category.

If you aren’t familiar with our ANC solution, allow me to provide an overview — which will also help explain why the embedded AWARD judges are so impressed.

Automakers need to reduce fuel consumption. And to do that, they employ techniques such as variable engine displacement and operating the engine at lower RPM. These techniques may save gas, but they also result in "boom" noise that permeates the car's interior and can lead to driver distraction. And who needs more distraction?

QNX Acoustics for Active Noise Control can integrate 
seamlessly into a vehicle's infotainment system.
To reduce this noise, automakers use ANC, which plays “anti-noise” (sound equal in amplitude but opposite in phase to the offending engine tones) over the car's speakers. The problem is, existing ANC systems require dedicated hardware, which adds design complexity, not to mention significant bill-of-materials costs. And who needs more costs?
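To see why anti-noise works, here is a minimal sketch of the underlying principle: a wave summed with an equal-amplitude, phase-inverted copy of itself cancels out. The 100 Hz tone and sample rate here are illustrative assumptions; a real ANC system continuously adapts the anti-noise to the engine's changing speed and the cabin's acoustics rather than inverting one fixed tone.

```javascript
// Illustrative only: cancel a steady 100 Hz "boom" tone with its inverse.
// A production ANC system adapts in real time; this sketch shows just
// the destructive-interference idea.
const sampleRate = 8000;   // samples per second (illustrative)
const toneHz = 100;        // offending engine tone (illustrative)
const n = 512;

// The unwanted engine tone as heard in the cabin
const noise = Array.from({ length: n },
  (_, i) => Math.sin(2 * Math.PI * toneHz * i / sampleRate));

// Anti-noise: equal amplitude, inverted phase
const antiNoise = noise.map(s => -s);

// What the listener hears: the sum of the two waves
const residual = noise.map((s, i) => s + antiNoise[i]);
const peak = Math.max(...residual.map(Math.abs));  // effectively zero
```

In practice the hard part is not the inversion but estimating the engine tones accurately enough, and quickly enough, that the cancellation holds as RPM changes.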

Enter QNX Acoustics for ANC. Rather than use dedicated hardware, QNX ANC provides a software library that can run on the existing DSP or CPU of the car's head unit or audio system. This approach not only reduces hardware costs but also enables better performance, faster development, and more design flexibility. I could go on, but I will let my colleague Tina Jeffrey provide the full skinny.

Did I mention? This isn’t the first time QNX Software Systems has been tapped for an embedded AWARD. It has won two so far, in 2006 and 2004, for innovations in multi-core and power-management technology. It was also a finalist in 2010, for its persistent publish/subscribe messaging. Here's to making it a hat trick.

DevCon5 recap: building apps for cars

Tina Jeffrey
Last week I had the pleasure of presenting at the DevCon5 HTML5 & Mobile App Developers Conference, held at New York University in the heart of NYC. The conference was abuzz with the latest and greatest web technologies for a variety of markets, including gaming, TV, enterprise, mobile, retail, and automotive.

The recurring theme throughout the event was that HTML5 is mainstream. Even though HTML5 still requires some ripening as a technology, it is definitely the burgeoning choice for app developers who wish to get their apps onto as many platforms as possible, quickly and cost effectively. And when a developer is confronted with a situation where HTML5 falls short (perhaps a feature that isn’t yet available), then hybrid is always an option. At the end of the day, user experience is king, and developers need to design and ship apps that offer a great experience and keep users engaged, regardless of the technology used.

Mainstream mobile device platforms all have web browsers to support HTML5, CSS3, and JavaScript. And there’s definitely no shortage of mobile web development frameworks to build consumer and enterprise apps that look and perform like native programs. Many of these frameworks were discussed at the conference, including jQuery Mobile, Dojo Mobile, Sencha Touch, and Angular JS. Terry Ryan of Adobe walked through building a PhoneGap app and discussed how the PhoneGap Build tool lets programmers upload their code to a cloud compiler and automatically generate apps for every supported platform — very cool.

My colleague Rich Balsewich, senior enterprise developer at BlackBerry, hit a home run with his presentation on the multiple paths to building apps. He walked us through developing an HTML5 app from end to end, and covered future features and platforms, including the automobile. A special shout-out to Rich for plugging my session “The Power of HTML5 in the Automobile” held later that afternoon.

My talk provided app developers with some insight into creating apps for the car, and discussed the success factors that will enable automakers to leverage mobile development — key to achieving a rich, personalized, connected user experience. Let me summarize with the salient points:

What’s needed, and what we’re doing about it:

What’s needed: The automotive community wants apps, and HTML5 provides a common app platform for infotainment systems.
What we’re doing: We’ve implemented an HTML5 application framework in the QNX CAR Platform for Infotainment.

What’s needed: Automotive companies must leverage the broad mobile developer ecosystem to bring differentiated automotive apps and services to the car.
What we’re doing: We’re helping by getting the word out and by building a cloud-based app repository that will enable qualified app partners to get their apps in front of automotive companies. We plan to roll out this repository with the release of the QNX CAR Platform 2.1 in the fall.

What’s needed: The developer community needs standardized automotive APIs.
What we’re doing: We’re co-chairing the W3C Automotive and Web Platform Business Group, which has a mandate to create a draft specification of a vehicle data API. We’re also designing the QNX CAR Platform APIs to be Apache Cordova-compliant.

What’s needed: Automotive platform vendors must supply tools that enable app developers to build and test their apps.
What we’re doing: We plan to release the QNX CAR Platform 2.1 with open, accessible tooling to make it easy for developers to test their apps in a software-only environment.

The CLA 45 has landed!

Megan Alink
Europe, your day has come! After five years of showcasing our technology concept cars primarily in North America, we’ve bid farewell to the Mercedes CLA 45 and sent it across the pond to our colleagues in Germany. Over the coming year while the Mercedes resides in Europe, our customers — and anyone who’s just mesmerized by slick, pre-integrated automotive tech — will have a chance to check the car out at a number of public events. (Stay tuned to www.qnx.com for more details as these events arise.)

Witness the unboxing:

The CLA 45 emerges into the light at Bremerhaven.

On land and settling in nicely.

So beautiful! We can't wait for a whole new continent to see it for itself.

Interested in a sneak peek at the inside of this gorgeous vehicle? Read this blog from Lynn Gayowski, or get up close and personal with the digital instrument cluster in this one from Paul Leroux. For more photos, see our Flickr album.

A matter of urgency: preparing for ISO 26262 certification

Yoshiki Chubachi
Guest post by Yoshiki Chubachi, automotive business development manager for QNX Software Systems, Japan

Two weeks ago in Tokyo, QNX Software Systems sponsored an ISO 26262 seminar hosted by IT Media MONOist, a Japanese information portal for engineers. This was the fourth MONOist seminar to focus on the ISO 26262 functional safety standard, and the theme of the event conveyed an unmistakable sense of urgency: “You can’t afford to wait any longer: how you should prepare for ISO 26262 certification”.

In his opening remarks, Mr. Pak, a representative of MONOist, noted that the number of attendees for this event increases every year. And, as the theme suggests, many engineers in the automotive community feel a strong need to get ready for ISO 26262. In fact, registration filled up just three days after the event was announced.

The event opened with a keynote speech by Mr. Koyata of the Japan Automobile Research Institute (JARI), who spoke on functional safety as a core competency for engineers. A former engineer at Panasonic, Mr. Koyata now works as an ISO 26262 consultant at JARI. In his speech, he argued that every automotive developer should embrace knowledge of ISO 26262 and that automakers and Tier 1 suppliers should adopt a functional "safety culture." Interestingly, his argument aligns with what Chris Hobbs and Yi Zheng of QNX advocate in their paper, “10 truths about building safe embedded software systems.” Mr. Koyata also discussed the difference between safety and hinshitsu (quality), which has long been a strength of Japanese industry.

Next up were presentations by the co-sponsor DNV Business Assurance Japan. The talks focused on safety concepts and architecture as well as on metrics for hardware safety design for ISO 26262.

I had the opportunity to present on software architecture and functional safety, describing how the QNX microkernel architecture can provide an ideal system foundation for automotive systems with functional safety requirements. I spoke to a number of attendees after the seminar, and they all recognized the need to build an ISO 26262 process, but didn’t know how to start. The need, and opportunity, for education is great.

Yoshiki presenting at the MONOist ISO 26262 seminar. Source: MONOist

The event ended with a speech by Mr. Shiraishi of Keio University. He has worked on space satellite systems and offered some interesting comparisons between the functional safety of space satellites and automotive systems.

Safety and reliability go hand in hand. “Made in Japan” is a brand widely known for its reliability. Although Japan is somewhat behind when it comes to awareness of ISO 26262 certification, I see great potential for it to become the leader in automotive safety. Japanese engineers take pride in the reliability of the products they build, and this mindset can be extended to the new generation of functional safety systems in automotive.


Additional reading

QNX Unveils New OS for Automotive Safety
Architectures for ISO 26262 systems with multiple ASIL requirements (whitepaper)
Protecting Software Components from Interference in an ISO 26262 System (whitepaper)
Ten Truths about Building Safe Embedded Software Systems (whitepaper)

Bad idea, good idea

Why equip cars with external-sounding speakers? I thought you'd never ask. As it turns out, it can be a really bad idea. Or a really good one.

Here, for example, is a case where bad arguably prevails:


Source: Modern Mechanix blog

No doubt, the person who devised this system in 1931 thought it a brilliant, or at least entertaining, idea. Fortunately, common sense prevailed and the era of the "auto speaker," with its potential to scare the living daylights out of pedestrians, never came to pass.

But here's the thing: equipping cars with external-sounding speakers can be a great idea, when done for the right reasons. For example, some hybrid and electric vehicles are dangerously quiet for bicyclists and visually impaired pedestrians. Adding speakers to emit audible alerts or to project synthesized engine sounds can be just what the doctor ordered. Or rather, what the parliament ordered: earlier this month, members of the European Parliament stated that they want automakers to install acoustic alerting systems in hybrid vehicles by July 2019.

Mind you, safety isn't the only reason to project synthesized engine sounds. For example, fuel-saving techniques can make even powerful engines sound wimpy — a problem when high performance is a key ingredient of a car's branding. In that case, the automaker may wish to project synthesized engine sounds over both external and internal speakers. The speakers can help preserve the car's wow factor (provided they're not too loud) and the internal speakers, in particular, can make it easier for car owners who drive manual to shift gears by ear. The QNX concept car for acoustics offers a good example of this technology in action.
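As a rough illustration of how such sounds can be synthesized, the dominant "firing" frequency of a four-stroke engine can be derived from RPM and cylinder count (each cylinder fires once per two crankshaft revolutions), and a tone generated at that frequency. This is a deliberate simplification for illustration; production engine-sound systems blend many engine orders and shape them to match the brand's signature sound.

```javascript
// Sketch: derive the dominant firing-order frequency of a four-stroke
// engine and synthesize a sine tone at that frequency. Real ESE systems
// are far more sophisticated; this shows only the RPM-to-frequency idea.
function firingFrequencyHz(rpm, cylinders) {
  // Revolutions per second, times firings per revolution
  // (each cylinder fires once every two revolutions)
  return (rpm / 60) * (cylinders / 2);
}

function synthesizeTone(freqHz, seconds, sampleRate = 8000) {
  const n = Math.floor(seconds * sampleRate);
  return Array.from({ length: n },
    (_, i) => Math.sin(2 * Math.PI * freqHz * i / sampleRate));
}

const f = firingFrequencyHz(3000, 8);   // a V8 at 3000 RPM: 200 Hz
const samples = synthesizeTone(f, 0.25); // a quarter-second of tone
```

In a shipping system, that tone would track the engine's RPM signal in real time, which is also what makes shifting by ear possible even when the real engine is running quietly.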

All of which to say, engine sound enhancement, also known as ESE, is here to stay. And it's not a bad time to be in the automotive-speaker business, either.

My top moments of 2015 — so far

Paul Leroux
Yes, I know, 2015 isn’t over yet. But it’s been such a milestone year for our automotive business that I can’t wait another two months to talk about it. And besides, you’ll be busy as an elf at the end of December, visiting family and friends, skiing the Rockies, or buying exercise equipment to compensate for all those holiday carbs. Which means if I wait, you’ll never get to read this. So let’s get started.


We unveil a totally new (and totally cool) technology concept car
Times Square. We were there.
It all began at 2015 CES, when we took the wraps off the latest QNX technology concept car — a one-of-a-kind Bentley Continental GT. The QNX concept team outfitted the Bentley with an array of technologies, including a high-definition DLP display, a 3D rear-view camera, cloud-based voice recognition, smartphone connectivity, and… oh heck, just read the blog post to get the full skinny.

Even if you weren’t at CES, you could still see the car in action. Brian Cooley of CNET, Michael Guillory of Texas Instruments, the folks at Elektrobit, and Discovery Canada’s Daily Planet were just some of the individuals and organizations who posted videos. You could also connect to the car through a nifty web app. Heck, you could even see the Bentley’s dash on the big screen in Times Square, thanks to the promotional efforts of Elektrobit, who also created the 3D navigation software for the concept car.

We ship the platform
We wanted to drive into CES with all cylinders firing, so we also released version 2.0 of the QNX CAR Platform for Infotainment. In fact, several customers in the U.S., Germany, Japan, and China had already started to use the platform, through participation in an early access program. Which brings me to the next milestone...

Delphi boards the platform
The first of many.
Also at CES, Delphi, a global automotive supplier and long-time QNX customer, announced that version 2.0 of the QNX CAR Platform will form the basis of its next-generation infotainment systems. As it turned out, this was just one of several QNX CAR customer announcements in 2015 — but I’m getting ahead of myself.

We have the good fortune to be featured in Fortune
Fast forward to April, when Fortune magazine took a look at how QNX Software Systems evolved from its roots in the early 1980s to become a major automotive player. Bad news: you need a subscription to read the article on the Fortune website. Good news: you can read the same article for free on CNN Money. ;-)

A music platform sets the tone for our platform
In April, 7digital, a digital music provider, announced that it will integrate its 23+ million track catalogue with the QNX CAR Platform. It didn't take long for several other partners to announce their platform support. These include Renesas (R-Car system-on-chip for high-performance infotainment), AutoNavi (mobile navigation technology for the Chinese market), Kotei (navigation engine for the Japanese market), and Digia (Qt application framework).

We stay focused on distraction
Back in early 2015, Scott Pennock of QNX was selected to chair an ITU-T focus group on driver distraction. The group’s objective was serious and its work was complex, but its ultimate goal was simple: to help reduce collisions. This year, the group wrapped up its work and published several reports — but really, this is only the beginning of QNX and ITU-T efforts in this area.

We help develop a new standard
Goodbye fragmentation; hello
standard APIs.
Industry fragmentation sucks. It means everyone is busy reinventing the wheel when they could be inventing something new instead. So I was delighted to see my colleague Andy Gryc become co-chair of the W3C Automotive and Web Platform Business Group, which has the mandate to accelerate the adoption of web technologies in the car. Currently, the group is working to draft a standard set of JavaScript APIs for accessing vehicle data. Fragmentation, thy days are numbered.
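To make the idea concrete, here is a hypothetical sketch of what app code against such a vehicle data API might look like. The names here (VehicleSignals, get, subscribe, publish, the 'vehicleSpeed' signal) are my own assumptions for illustration, not the W3C group's draft API; the point is the pattern: apps read or subscribe to named vehicle signals instead of talking to each automaker's proprietary interface.

```javascript
// Hypothetical, simplified vehicle-data access layer. In a real system,
// publish() would be driven by data arriving from the vehicle bus (e.g. CAN).
class VehicleSignals {
  constructor() {
    this.values = new Map();    // latest value per signal
    this.handlers = new Map();  // subscriber callbacks per signal
  }
  // Update a signal and notify all subscribers
  publish(name, value) {
    this.values.set(name, value);
    for (const cb of this.handlers.get(name) || []) cb(value);
  }
  // One-shot read of the latest value
  get(name) {
    return this.values.get(name);
  }
  // Register a callback for ongoing updates
  subscribe(name, cb) {
    if (!this.handlers.has(name)) this.handlers.set(name, []);
    this.handlers.get(name).push(cb);
  }
}

// Usage: an app watches vehicle speed
const vehicle = new VehicleSignals();
let lastSpeed = null;
vehicle.subscribe('vehicleSpeed', s => { lastSpeed = s; });
vehicle.publish('vehicleSpeed', 57);  // km/h, as if from the vehicle bus
```

With a standard shaped roughly like this, the same app code could run in any compliant head unit, which is exactly the fragmentation problem the group set out to solve.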

We launch an auto safety program
A two-handed approach to
helping ADAS developers.
On the one hand, we have a 30-year history in safety-critical systems and proven competency in safety certifications. On the other hand, we have deep experience in automotive software design. So why not join both hands together and allow auto companies to leverage our full expertise when they are building digital instrument clusters, advanced driver assistance systems (ADAS), and other in-car systems with safety requirements?

That’s the question we asked ourselves, and the answer was the new QNX Automotive Safety Program for ISO 26262. The program quickly drew support from several industry players, including Elektrobit, Freescale, NVIDIA, and Texas Instruments.

We jive up the Jeep
A tasty mix of HTML5 & Android
apps, served on a Qt interface,
with OpenGL ES on the side.
If you don’t already know, we use a Jeep Wrangler as our reference vehicle — basically, a demo vehicle outfitted with a stock version of the QNX CAR Platform. This summer, we got to trick out the Jeep with a new, upcoming version of the platform, which adds support for Android apps and for user interfaces based on the Qt 5 framework.

Did I mention? The platform runs Android apps in a separate application container, much like it handles HTML5 apps. This sandboxed approach keeps the app environment cleanly partitioned from the UI, protecting both the UI and the overall system from unpredictable web content. Good, that.

The commonwealth’s leader honors our leader
I only ate one piece. Honest.
Okay, this one has nothing to do with automotive, but I couldn’t resist. Dan Dodge, our CEO and co-founder, received a Queen Elizabeth II Diamond Jubilee Medal in recognition of his many achievements and contributions to Canadian society. To celebrate, we gave Dan a surprise party, complete with the obligatory cake. (In case you’re wondering, the cake was yummy. But any rumors suggesting that I went back for a second, third, and fourth piece are total fabrications. Honestly, the stories people cook up.)

Mind you, Dan wasn’t the only one to garner praise. Sheridan Ethier, the manager of the QNX CAR development team, was also honored — not by the queen, but by the Ottawa Business Journal for his technical achievements, business leadership, and community involvement.

Chevy MyLink drives home with first prize — twice
There's nothing better than going home with first prize. Except, perhaps, doing it twice. In January, the QNX-based Chevy MyLink system earned a Best of CES 2015 Award, in the car tech category. And in May, it pulled another coup: first place in the "Automotive, LBS, Navigation & Safe Driving" category of the 2015 CTIA Emerging Technology (E-Tech) Awards.

Panasonic, Garmin, and Foryou get with the platform
Garmin K2 platform: because
one great platform deserves
another.
August was crazy busy — and crazy good. Within the space of two weeks, three big names in the global auto industry revealed that they’re using the QNX CAR Platform for their next-gen systems. Up first was Panasonic, who will use the platform to build systems for automakers in North America, Europe, and Japan. Next was Foryou, who will create infotainment systems for automakers in China. And last was Garmin, who is using the platform in the new Garmin K2, the company’s infotainment solution for automotive OEMs.

And if all that wasn’t cool enough…

Mercedes-Benz showcases the platform
Did I mention I want one?
When Mercedes-Benz decides to wow the crowds at the Frankfurt Motor Show, it doesn’t settle for second best. Which is why, in my not so humble opinion, they chose the QNX CAR Platform for the oh-so-desirable Mercedes-Benz Concept S-Class Coupé.

Mind you, this isn’t the first time QNX and Mercedes-Benz have joined forces. In fact, the QNX auto team and Mercedes-Benz Research & Development North America have collaborated since the early 2000s. Moreover, QNX has supplied the OS for a variety of Mercedes infotainment systems. The infotainment system and digital cluster in the Concept S-Class Coupé are the latest — and arguably coolest — products of this long collaboration.

We create noise to eliminate noise
Taking a sound approach to
creating a quieter ride.
Confused yet? Don’t be. You see, it’s quite simple. Automakers today are using techniques like variable cylinder management, which cut fuel consumption (good), but also increase engine noise (bad). Until now, car companies have been using active noise control systems, which play “anti-noise” to cancel out the unwanted engine sounds. All fine and good, but these systems require dedicated hardware — and that makes them expensive. So we devised a software product, QNX Acoustics for Active Noise Control, that not only out-performs conventional solutions, but can run on the car’s existing audio or infotainment hardware. Goodbye dedicated hardware, hello cost savings.

And we flub our lines on occasion
Our HTML5 video series has given companies like Audi, OnStar, Gartner, TCS, and Pandora a public forum to discuss why HTML5 and other open standards are key to the future of the connected car. The videos are filled with erudite conversation, but every now and then, it becomes obvious that sounding smart in front of a camera is a little harder than it looks. So what did we do with the embarrassing bits? Create a blooper reel, of course.

Are these bloopers our greatest moments? Nope. Are they among the funniest? Oh yeah. :-)

Monday, June 29, 2015

QNX-based nav system helps Ford SUVs stay on course down under

Paul Leroux
This just in: SWSA, a leading electronics supplier to the Australian automotive industry, and NNG, the developer of the award-winning iGO navigation software, have created a QNX-based navigation system for Ford Australia. The new system has been deployed in Ford Territory SUVs since June of this year.

To reduce driver distraction, the system offers a simplified user interface and feature set. And, to provide accurate route guidance, the system uses data from an internal gyroscope and an external traffic message channel, as well as standard GPS signals. Taking the conditions of local roads into account, the software provides a variety of alerts and speed-camera warnings; it also offers route guidance in Australian English.

The navigation system is based on the iGO My way Engine, which runs in millions of navigation devices worldwide. To read NNG's press release, click here.


SWSA's new nav system for the Ford Territory is based on the Freescale
i.MX31L processor, QNX Neutrino RTOS, and iGO My way Engine.

 

QNX-powered 2015 Audi TT named best-connected car

Is it innovative, beautiful, versatile, or just plain cool? I haven’t quite decided, so I’m thinking it’s all of the above. The QNX-based virtual cockpit in the 2015 Audi TT is a ravishing piece of automotive technology, and it brings driver convenience to a new level by integrating everything from speed and navigation to music and handsfree calling — all in a single, user-configurable display.

It seems I’m not the only one who's impressed. Because last week, 42,500 readers of “auto motor und sport” and “CHIP” chose the Audi TT as the industry's best-connected car. In fact, Audi took top honors in several categories, including navigation, telephone integration, sound system, entertainment/multimedia, and connected car.

To get an idea of what all the fuss is about, check out our video of the Audi TT’s virtual cockpit in action. We filmed this at CES earlier this year:



For more information on the award and the Audi TT, read Audi's press release.

Meet the QNX concept team: Jonathan Hacker, software developer

Jonathan Hacker
Last week, we treated you to an interview with Mark Rigley, the concept development team’s director. This week, we meet up with someone who has worked on several of the team’s projects, including the Porsche 911 and Jeep Wrangler. His name is Jonathan Hacker — a wonderful aptronym, if ever there was one.

So tell us, Jon, what do you do on the concept team?
I’m a software developer. I spend much of my time listening to people so I can understand what, exactly, we want to accomplish in a concept system. I then figure out how we can use software to achieve our goal. I also spend quite a bit of my time coding.

What do you like best about being on the concept team?
I like taking a big problem, coming up with a crazy solution that no one had thought of, and turning it into something real.

Has there been a standout moment for you while working on the team?
Yes, when we were trying to get the digital speedometer to work on the Porsche 911. We drove dozens of laps around QNX headquarters while I sat in the passenger seat with my laptop, taking readings off the Porsche’s CAN bus. It was a blast — especially since we got the speedometer to work!

What is your biggest challenge right now? What keeps you up at night?

Working on concept projects is a juggling act. There are always many little pieces of software and hardware drivers being developed at the same time, and everything has to come together seamlessly. I’ve always been more of a programmer than a project manager, so making sure everything stays on track keeps me on my toes.

Who would you like to see seated in a QNX technology concept car or reference vehicle?
This couldn’t happen in real life because he’s a fictional character, but in almost every mockup produced by our designers, Gordon Freeman is phoning the car — you know, the protagonist in the Half-Life video game series. So it would be awesome to see Gordon Freeman sitting in the car. But unless it’s someone in a costume, that’s not going to happen!

What is your dream car?
The Porsche ruined most other cars for me; it really is that amazing. But if I had to pick one, it would be the Audi R8. It’s a fantastic-looking car.

Are you excited about the new concept car that we plan to unveil at CES?

Of course — it’s going to rock! We are building some really awesome stuff into this car. People will be impressed.

We showed you so

QNX has been building NFC functionality into concept cars since 2015. Now, with the advent of automotive-grade tags and chips, NFC may be coming to a dashboard near you.

Paul Leroux
Why does QNX transform vehicles like the Maserati QuattroPorte GTS, Mercedes-Benz CLA45, and Bentley Continental into technology concept cars? I can think of many reasons, but three stand out. First, the cars allow us to demonstrate the inherent flexibility and customizability of QNX technology. If you could put all of the cars side by side, you would quickly see that, while they all use the same QNX platform, each has a unique feature set and a distinctive look-and-feel — no two are alike. This flexibility is of immense importance to automakers, who, for reasons of market differentiation, need to deliver a unique brand experience in each marque or vehicle line. Alf Pollex, Head of Connected Car and Infotainment at Volkswagen, says it best: “the QNX platform... enables us to offer a full range of infotainment systems, from premium level to mass volume, using a single, proven software base.”

Second, the cars explore how thoughtful integration of new technologies can make driving easier, more enjoyable, and perhaps even a little safer. Case in point: the Maserati’s obstacle awareness display, which demonstrates how ADAS systems can aggregate data from ultrasonic and LiDAR sensors to help drivers become more aware of their surroundings. This display works much like a heads-up display, but instead of providing speed, RPM, or navigation information, it offers visual cues that help the driver gauge the direction and proximity of objects around the vehicle — pedestrians, for example.
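The aggregation step can be sketched simply: for each sector around the car, keep the nearest obstacle reported by any sensor, then map that distance to a visual cue. This Python sketch is purely illustrative; the sector names and thresholds are invented, not taken from the Maserati system:

```python
# Illustrative sketch (not QNX code) of fusing per-sector distance
# readings from ultrasonic and LiDAR sensors into simple display cues.

def fuse_sectors(ultrasonic: dict, lidar: dict) -> dict:
    """For each sector, keep the nearest reported obstacle (in metres)."""
    fused = {}
    for sector in set(ultrasonic) | set(lidar):
        readings = [d for d in (ultrasonic.get(sector), lidar.get(sector))
                    if d is not None]
        fused[sector] = min(readings)   # nearest obstacle wins
    return fused

def cue(distance_m: float) -> str:
    """Map a distance to a display cue; thresholds are arbitrary examples."""
    if distance_m < 1.0:
        return "red"
    if distance_m < 3.0:
        return "amber"
    return "green"

fused = fuse_sectors({"front": 2.5, "rear": 6.0}, {"front": 2.1, "left": 0.8})
cues = {sector: cue(dist) for sector, dist in fused.items()}
```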

Look ma, no menus: At 2015 CES, a QNX concept car
showcased how NFC can enable single-tap Bluetooth
phone pairing.
Source: CrackBerry.com
Third, the cars explore solutions that address real and immediate pain points. Take, for example, the pairing of Bluetooth phones. Many consumers find this task difficult and time-consuming; automakers, for their part, see it as a source of customer dissatisfaction. So, in 2015, we started to equip some of our concept cars with near field communication (NFC) technology that enables one-touch phone pairing. This pairing is as easy as it sounds: you simply touch an NFC-enabled phone to an NFC tag embedded in the car’s console, and voilà, pairing with the car’s infotainment system happens automatically.
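Conceptually, the tag simply stores the head unit's Bluetooth identity for the phone to read. Here is a toy Python sketch of that idea; the payload format is invented for illustration and is not the NFC Forum Connection Handover format a real system would use:

```python
# Toy sketch of the one-touch pairing idea: an NFC tag in the console
# holds the head unit's Bluetooth address and name; the phone reads the
# tag and initiates pairing. The payload layout below is invented -- real
# systems use the NFC Forum Connection Handover specification.

def encode_pairing_tag(bt_address: str, device_name: str) -> bytes:
    """Pack a Bluetooth address and friendly name into a tag payload."""
    addr = bytes(int(octet, 16) for octet in bt_address.split(":"))  # 6 bytes
    name = device_name.encode("utf-8")
    return bytes([len(addr)]) + addr + bytes([len(name)]) + name

def decode_pairing_tag(payload: bytes):
    """What the phone side would do: recover (address, name) from the tag."""
    alen = payload[0]
    addr = ":".join(f"{b:02X}" for b in payload[1:1 + alen])
    nlen = payload[1 + alen]
    name = payload[2 + alen:2 + alen + nlen].decode("utf-8")
    return addr, name

tag = encode_pairing_tag("00:1A:2B:3C:4D:5E", "QNX Concept Car")
# decode_pairing_tag(tag) recovers the address and name for pairing
```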

Prime time
NFC in the car holds much promise, but when, exactly, will it be ready for prime time? Pretty soon, as it turns out. In a recent article, “NFC looks to score big in cars,” Automotive Engineering International points to several vendors, including Broadcom, NXP, Melexis, Texas Instruments and ams AG, that have either announced or shipped automotive-grade NFC solutions. NXP, for example, expects that some of its NFC tags and chips will first go into production cars around 2016.

Mind you, NFC isn’t just for phone pairing. It can, for example, enable key-fob applications that allow phones to store user preferences for seat positions and radio stations. It can also enable use cases in which multiple drivers operate the same vehicle, such as car sharing or fleet management. The important thing is, it’s moving from concept to production, marking one more step in the seamless integration of cars and smartphones.



Did you know…
  • BMW embeds NFC tags not only in its cars, but also in print ads.
  • IHS has predicted that, in 2018, global shipments of NFC-equipped cellphones will reach 1.2 billion units.
  • NFC World publishes a living document that lists all of the NFC handsets available worldwide.

Report from CTIA Wireless: Apps in the Car

You wouldn’t think that CTIA Wireless, a mobile show, would be a good venue for a car guy. But automotive journalist Doug Newcomb put together a set of panels that managed to attract everyone from the automotive industry who attended the show.

I met a good number of friends from a variety of automakers, tier one suppliers, and hardware and software vendors. I also had the distinct pleasure of participating in one of Doug's panels, which was moderated by Damon Lavrinc of WIRED.

The topic was the future of apps in the car, and it generated a spirited discussion. Panel participants included Geoff Snyder from Pandora, Michelle Avary from Toyota, Henry Bzeih from Kia, and Scott Burnell from Ford — all experts on the topic.

Andy speaking on the
apps panel. Videos of all
the panels are now online.
In general, we agreed: apps are coming to the car. They have already arrived in several cases, and it’s only a matter of time before they come to mass-market vehicles. And apps are not for North America alone: they're a worldwide phenomenon.

Mind you, we engaged in lively debate on a number of questions: What role does the mobile app developer play? How to deal with the fragmentation caused by different OEM app platforms? How to deal with driver distraction? And when will the "one man app" ever make it into the car? We all had good and varied opinions on these topics, and the session was very well received by the audience.

Derek Kuhn, QNX vice president of sales and marketing, also participated in a panel session, titled "Can we all just get along… for the consumer's sake?". That panel focused on how the industry as a whole can create a more seamless experience for the consumer. Derek's co-panelists included Mark Harland from GM, Leo McCloskey from Airbiquity, Brian Radloff from Nuance, and Niall Berkery from Telenav.

Did I mention? Videos of all the panels are now on Doug Newcomb's website — check them out!
 

Sunday, June 28, 2015

Recall? What recall?

Red Bend demonstrates firmware-over-the-air (FOTA) updates of QNX CAR 2 application platform at Telematics Munich

I think anyone with a passing knowledge of software development in automotive would agree that the infotainment systems currently under development are light years ahead of the systems that shipped only 5 years ago. The blurring of the automotive and the consumer experience is accelerating at an amazing pace. And the processing power being specified for next-gen infotainment aligns with what is expected in advanced smartphones.

It's no surprise, then, that the size of the code base and the complexity of the underlying software is growing at a similar pace. This complexity creates a maintenance challenge. On your phone, upgrades are pushed out regularly in a way that you barely notice: you get a notification of an update, push a couple buttons, and presto, you are up to date. In automotive, if we stick to the traditional methodology, this same type of upgrade would require a recall. You'd have to take your car to the dealership and they would reflash whatever needs to be updated. Expensive for the auto manufacturer and a big pain for the consumer.

Thankfully, people are thinking about this. Companies like Red Bend Software have cut their teeth in the mobile space, specializing in firmware-over-the-air updates, or FOTA for short. They can generate something called a delta file, which effectively encapsulates the difference (or delta) between what is currently on the end device and the new software build. In some cases, the file can be up to 50 times smaller than the new build. They also have the ability to track current load status of all the devices deployed.
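The delta idea is easy to sketch: walk the old and new images, reuse the byte runs they share, and ship only the bytes that changed. This toy Python version (built on difflib, not Red Bend's proprietary algorithm) shows the principle:

```python
import difflib

# Toy illustration of delta updates (not Red Bend's algorithm): instead
# of shipping the whole new firmware image, ship instructions for turning
# the image already on the device into the new one.

def make_delta(old: bytes, new: bytes):
    """Produce copy/insert instructions that rebuild `new` from `old`."""
    ops = []
    matcher = difflib.SequenceMatcher(a=old, b=new, autojunk=False)
    for tag, i1, i2, j1, j2 in matcher.get_opcodes():
        if tag == "equal":
            ops.append(("copy", i1, i2 - i1))   # reuse bytes already on device
        elif tag in ("replace", "insert"):
            ops.append(("insert", new[j1:j2]))  # ship only the changed bytes
        # "delete": old bytes are simply never copied
    return ops

def apply_delta(old: bytes, ops) -> bytes:
    """What the device would do: rebuild the new image from old + delta."""
    out = bytearray()
    for op in ops:
        if op[0] == "copy":
            _, offset, length = op
            out += old[offset:offset + length]
        else:
            out += op[1]
    return bytes(out)

old_build = b"firmware v1.0 build 100"
new_build = b"firmware v1.1 build 101"
delta = make_delta(old_build, new_build)   # far smaller than new_build
rebuilt = apply_delta(old_build, delta)    # identical to new_build
```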

So what does that get you? Using FOTA, OEMs will be able to minimize the network bandwidth required for upgrades and to manage the update process remotely, moving us all towards that Zen state of automagic. I don't know about you, but anything that saves me a trip to the dealer is a good thing.

Red Bend will demonstrate this capability by updating versions of the QNX CAR 2 application platform this week at Telematics Munich. So if you happen to be there, do stop to check it out.

Autonomous, not driverless

Paul Leroux
I don't know about you, but I'm looking forward to the era of self-driving cars. After all, why spend countless hours negotiating rush-hour traffic when the car could do all the work? Just think of all the things you could do instead: read a novel, Facebook with friends, or even watch Babylon 5 re-runs.

Unlike Babylon 5, this scenario is no longer a page out of science fiction. It’s coming soon, faster than many imagine. That said, the story of the self-driving car still has a few unfinished chapters — chapters in which the human driver still has an important role to play. Yes, that means you.

As I’ve discussed in previous posts, the fully autonomous car is a work in progress. In fact, some of the technologies that will enable cars to drive themselves (adaptive cruise control, forward collision avoidance, etc.) are already in place. Moreover, research suggests that these technologies can, among other things, improve traffic flow and reduce accidents. But does that mean you will soon be able to sit back, close your eyes, and let the car do everything? Not quite.

Evolution, not revolution
If you ask me, Thilo Koslowski of Gartner hit the bull's eye when he said that self-driving cars will go through three evolutionary phases: from automated to autonomous to unmanned. Until we reach the endpoint, we should pay heed to the words of Toyota's Jim Pisz: autonomous does not mean driverless.

If planes can do it…
Some folks hear this and are disappointed. They point to auto-pilot technology in planes and ask why we can’t have driverless cars sooner rather than later. The argument goes something like this: “It's much harder to fly a plane, yet we have no problem with a computer handling such a complex task. So why not let a computer drive your car?”

If only life were so simple. For one thing, automakers will have to make autonomous cars affordable — doable but not easy. They’ll also have to negotiate a variety of legal hurdles. And in any case, driving and flying have less in common than you might think.

When you drive, you must remain alert on a continuous basis. Lose your attention for a second, and you stand a good chance of hitting something or somebody. The same doesn't always hold true in flight. When a plane is cruising at 30,000 feet along a prescribed flight path, the pilot can divert his or her attention for 5 seconds and incur little chance of hitting anything. In comparison, a driver who becomes distracted for 5 seconds is hell on wheels.

And, of course, auto-pilot doesn’t mean pilot-less. As Ricky Hudi of Audi points out, pilots may rely on autopilot, but they still retain full responsibility for flying the plane. So just because your car is on auto-pilot doesn’t mean you can watch YouTube on your tablet. Bummer, I know.

An alarming solution
Source: Modern Mechanix blog (and yes, that should read Frankfurt)

All of which to say, the driver of an autonomous car will have to remain alert most or all of the time — until, of course, autonomous vehicles become better than humans at handling every potential scenario. Now that could happen, but it will take a while.

It seems that someone anticipated this problem in the early 50s when they invented “alarming glasses” — take a gander at the accompanying photo from the August 1951 issue of Modern Mechanix.

Scoff if you will, but a kinder and gentler form of this technology is exactly what autonomous cars need. No, I'm not suggesting that scientists find a better way to glue wires to eyelids. But I am saying that, until cars become fully and safely autonomous, drivers will need to pay attention — after all, it’s tempting to drift off when the car is doing all the work. And, indeed, technologies to keep drivers alert are already being developed.

Pre-warned means prepared
Mind you, it isn’t enough to keep the driver alert; the car may also need to issue “pre-warnings” for when the driver needs to take over. For instance, let’s say driving conditions become too challenging for the car’s autonomous mode to handle — these could include heavy rain, a street filled with pedestrians, or an area where lane markers are obscured by snow. In that case, the car can’t wait until it can no longer drive itself before alerting the driver, for the simple reason that the driver may simply take too long to assess the situation. The car will need to provide ample warning ahead of time.

The more, the better
That cars will become autonomous is inevitable. In fact, the more autonomous, the better, as far as I'm concerned. Research already suggests that technologies for enabling autonomous driving can, in many cases, do a better job of avoiding accidents and improving traffic flow than human drivers. They also seem to do better at things like parallel parking — a task that has caused more than one student driver to fail a driving test.

But does this all mean that, as a driver, I can stop paying attention? Not in the near future. But someday.

Pandora interview: Using HTML5 to deliver content to the car

At CES this year, our own Andy Gryc had a chance to sit down with Tom Conrad, CTO at Pandora, a long-time QNX CAR platform partner. Pandora is already in 85 vehicle models today and continues to grow its footprint, not only in automotive but in consumer as well.

Take a couple of minutes to hear Tom's perspective on standardizing on HTML5 across markets and to get a glimpse of the future of Internet radio in automotive. And make sure you watch the whole thing — there are some fun outtakes at the end.



Look ma, no driver!

Some of us talk about autonomous cars, some of us dream of owning one, and some of us actually get to ride in one. Andy Gryc is one of the latter. Head over to his blog to see a video he took while being chauffeured in a self-driving vehicle developed at the University of Parma — think of it as the ultimate in hands-free systems.

Would this be an awesome way to tour Italy, or what?

New SAP video: the connected car means business

I always enjoy a good read on the connected car — a topic that is very near and dear to me. One recent article that particularly excites me is a contributed piece for Forbes, “Can Connected Cars Help Change The World?” by Judith Magyar, executive office, product GTM and mobile division at SAP.

Why the excitement? Well, for one, the attention-grabbing headline is backed up by an insightful analysis of the promise of the connected car — even touching on the notion of the connected car as a means of environmental change and the four factors that are essential to this vision becoming reality. What’s more, Magyar uses QNX Software Systems’ very own concept car as an example of how the connected car and its real-world use cases are coming to fruition!

Want to see QNX and SAP’s collaboration in action? Check out this video, which shows how all of this technology would come together in one (very beautiful) vehicle:




The 10 qualities of highly effective hands-free systems

The first time I saw — and heard — a hands-free kit in action was in 1988. (Or was it 1989? Meh, same difference.) At the time, I was pretty impressed with the sound quality. Heck, I was impressed that hands-free conversations were even possible. You have to remember that mobile phones were still an expensive novelty — about $4000 in today’s US dollars. And good grief, they looked like this:



It’s almost a shock to see how far we’ve come since 1988. We’ve become conditioned to devices that cost far less, do far more, and fit into much smaller pockets. (Though, admittedly, the size trend for smartphones has shifted into reverse.) Likewise, we’ve become conditioned to hands-free systems whose sound quality would put that 1988 kit to shame. The sound might have been okay at the time, but because of the contrast effect, it wouldn’t pass muster today. Our ears have become too discerning.

Which brings me to a new white paper from Phil Hetherington and Andrew Mohan of the acoustics team at QNX Software Systems. Evaluating hands-free solutions from various suppliers can be a complex endeavor, for the simple fact that hands-free systems have become so sophisticated and complex. To help simplify the decision process, Phil and Andrew have boiled the problem down to 10 key factors:

  • Acoustic echo cancellation
  • Noise reduction and speech reconstruction
  • Multi-channel support
  • Automatic gain control
  • Equalization
  • Wind buffet suppression
  • Intelligibility enhancement
  • Noise dependent receive gain
  • Bandwidth extension
  • Wideband support

Ultimately, you must judge a hands-free solution by the quality of the useful sound it delivers. By focusing on these 10 essentials, you can make a much sounder judgment (pun fully intended).
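To make one of the factors above concrete, here is a toy sketch of automatic gain control: adapt a gain so that quiet speech is nudged toward a target level. The target and smoothing values are arbitrary illustration numbers, not anything taken from the QNX paper:

```python
# Toy sketch of automatic gain control (AGC), one of the ten factors.
# The target level and smoothing factor are invented illustration values.

def agc(samples, target=0.5, smoothing=0.9, gain=1.0):
    """Scale samples toward a target peak level, adapting the gain slowly."""
    out = []
    for s in samples:
        level = abs(s * gain)
        if level > 1e-9:
            desired = target / level
            # move the gain a small step toward the value that would
            # place this sample exactly at the target level
            gain = smoothing * gain + (1 - smoothing) * (gain * desired)
        out.append(max(-1.0, min(1.0, s * gain)))  # clip to valid range
    return out

# A quiet signal is gradually boosted toward the target level
quiet = [0.05, -0.04, 0.06, -0.05] * 50
boosted = agc(quiet)
```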

Recently, Electronic Design published a version of this paper on their website. For a longer version, which includes a decision checklist, visit the QNX download center.

HTML5 blooper reel

I find bloopers infinitely amusing — mind you, I’m talking about those that come on a reel, not those that happen for real. Missed deadlines, cost over-runs, IP disputes — these are the bloopers we all could do without.

Helping customers avoid bloopers is what we do — so to speak. Except, it seems, when we put them in front of the camera. <grin>

Seriously though, no customers were hurt in the making of this video.



This compilation of bloopers from the HTML5 series highlights the professionalism of QNX customers, partners, and employees as well as their good nature.
 

Biff! Bap! Ker-Pow! It’s the BatBerry interview!

Paul Leroux interviews Tim Neil, a director of product management at RIM, who is building his very own Batmobile™. This project might sound like fun (and Tim assures us it is), but it also demands a wealth of skills, from welding to HTML5 programming.


Tim Neil
Tim, could you give us a quick overview of the BatBerry project?
The BatBerry combines my love of cars, Batman, and technology. I’ve always wanted to build this car and I’ve had a couple of unsuccessful attempts at creating a carputer. When RIM started creating a 7" tablet, I knew the time was right to bring all of these interests together.

How did you get started on this project?
I started my research about 15 years ago, trying to determine how and where to get started. For instance, I needed to track down the shifter, which is a throttle quadrant from a WWII US Navy bomber.

By 2010, I had finished modifying my custom Subaru WRX, and I needed to get started on something new — working on cars is my way of escaping and relaxing. The time was right, and I got the green light from my wife. Luckily for me, she knew of my desire to build this car when we met and it didn’t scare her away. :-)

The BatBerry, about a year after Tim launched his project
Reading your blog, I’m totally impressed by the scope of the BatBerry project — be it creating dashboard panels, writing control software, or building a retractable license plate. Do you do most of the work yourself?

Yes, I try to do as much of the work myself as possible. I leave important things that I don’t have experience in, like doing the frame stretch, to the professionals. I did the same thing building up my Subaru over the past 7 years: learning how to do body work, interior, stereo, engine modifications, etc. I like to learn things as I go and I’ve always had a knack for figuring out how things work. I always figure, what’s the worst thing that can happen? If I screw up, I just have to try again.

To pull this off, you need to be a jack of all trades. I’m sure you had skills to begin with — but did you also have to pick up any along the way?
Welding is one of the biggest skills that I’ve picked up so far. I bought myself a welder, watched a couple of YouTube videos, and got to work. I can tell you, my welds look MUCH better now than my first ones. From all the welders I’ve talked to, it’s a skill that simply takes patience and practice.

Since I was a kid I have always been able to figure things out. When I was 8 years old I was wiring my bedroom up to have a switch on my headboard automatically open the door. The best way that I can describe to people how I see the world is by watching the movie Iron Man. When you see Iron Man’s computer JARVIS take an object and expand it out into a million pieces to show how it works, that’s what I see when I look at something.

Tim's other project: a highly modified Subaru WRX
What kind of power plant does the BatBerry use? Have you modded it?
The car currently has a 305 4.3L L99 V8. I haven’t really modified it yet. I will likely go with a re-built version of the same engine so that I can re-use the ECU. I’m not looking to make this car into a high-performance hot rod — that’s where my Subaru comes in. Plus, it’s nice to drive distances not always looking for a gas station that serves 94 octane. :-)

The V8 puts out 200hp, which should be pretty good for the BatBerry, considering it is basically a frame with a 400-pound fiberglass body mounted to it. As long as it sounds nasty I’ll be happy. I have a couple of Flowmaster 40 series mufflers for it.

Anyone who reads this blog knows we are bullish on HTML5. So I was fascinated to hear that the BatBerry project has an HTML5 connection. Could you tell us about it?
As the former development manager for BlackBerry WebWorks at RIM, I wanted to show what could be done with HTML5 technology. I wanted to build an interface on my PlayBook and BlackBerry Smartphone that could control some of the systems of the car.

I also wanted to share as much code as possible between the Smartphone and PlayBook, and using WebWorks and HTML5 allows me to do this. These devices pair with a Bluetooth connection on an Arduino board to control a series of relays that raise and lower the 30-cal machine guns, open and close the canopy, raise and lower the suspension, and perform other functions.
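The phone-to-Arduino link amounts to sending small command frames over a Bluetooth serial connection. This Python sketch shows the general shape of such a protocol; the frame layout, relay IDs, and checksum are invented for illustration (Tim's actual code lives in his BatBerry repo):

```python
# Hypothetical sketch of a command protocol a phone-side app could send
# over a Bluetooth serial link to an Arduino driving relays. The frame
# layout, relay IDs, and checksum below are invented for illustration.

RELAYS = {"machine_guns": 0x01, "canopy": 0x02, "suspension": 0x03}
RAISE, LOWER = 0x10, 0x11

def build_frame(relay: str, action: int) -> bytes:
    """Frame: start byte, relay ID, action, XOR checksum."""
    rid = RELAYS[relay]
    checksum = 0xA5 ^ rid ^ action
    return bytes([0xA5, rid, action, checksum])

def parse_frame(frame: bytes):
    """What the Arduino side would do: validate and decode a frame."""
    start, rid, action, checksum = frame
    if start != 0xA5 or checksum != start ^ rid ^ action:
        raise ValueError("corrupt frame")
    return rid, action

# Phone builds a frame; the microcontroller validates and acts on it
frame = build_frame("canopy", RAISE)
relay_id, action = parse_frame(frame)
```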

All the source code for the project, including Arduino microcontroller code, is being shared in my BatBerry repo on GitHub.


Sample screen captures of the BatBerry user interface

What has been your greatest challenge? And what are you most proud of, so far?
My biggest challenge has been finding time! I’ve been travelling for work more on weekends and while this winter was pretty mild, it was still a bit hard to head out into a freezing cold garage to put in a couple hours of work during the evenings.

I would say the two things I’m most proud of so far are my welding skills and my dash panels. I really wanted to give back something to others who have been building their own versions of this car. Screen-accurate dash panels were something missing from the community. In general, I really like to share what I’m doing so that others who want to do something similar can see what worked, and what didn’t work, for me.

The Discovery Channel has been tracking the BatBerry project. Do they plan to broadcast anything soon?
Nothing to air at the moment. The next step will be to get updated footage of some of the technology integration points. I’m getting close to being able to show the combination of HTML5, Arduino, and the machine guns to get some new footage. Once we reveal the car, filming will wrap up and go into post-production for airing sometime in the future on Daily Planet.

When you aren’t working on the BatBerry, what do you do?
I spend my spare time hanging out with my family, doing something with cars, or playing with technology. My daughter is a big Star Wars fan so she and I have been having some epic lightsaber battles lately. I’ve done a lot of car shows in the past with my Subaru and I really like meeting up and trading experiences with the car community around Toronto. At RIM, I direct the product management group responsible for developer tools, APIs, and SDKs — our focus is on removing barriers and adding features to make developers successful.

One more question: Which Batman character do you most identify with?
I would say Batman himself. While I’m not on the tipping point of insanity and looking to be a vigilante, I identify with the desire to make a difference. I also relate to the do-it-yourself attitude and the love of cool tech and cars. Plus, I’m just a geek at heart. :-)



To track the progress of the BatBerry project, check out Tim’s blog. You can also follow him on Twitter.

And while you’re at it, visit Tim’s YouTube channel. Here, for example, is a video showing the BatBerry’s replica machine guns:




Neither Tim Neil, his vehicle, nor Research In Motion (BlackBerry) are licensed by, endorsed by, sponsored by or affiliated with DC Comics or the owners of the “Batman” properties.