BMW Introduces Advanced Voice Command Technology with Expanded Gesture Control and Gaze Recognition


When automotive voice command technology was first launched several years ago, it wasn’t exactly the most effective system on the market. The technology often had a difficult time recognizing simple commands, and it took an awful lot of over-enunciating to get it to perform the desired functions.

Things have changed a lot since then, and today’s automotive voice command technology gets more advanced with each passing year. BMW is the latest company to take this innovation in a boundary-pushing direction. The automaker recently announced the rollout of a new system that teams advanced voice command technology with expanded gesture control and gaze recognition.

This new amenity is called BMW Natural Interaction, and it was unveiled to the public at Mobile World Congress 2019 in Barcelona, Spain. BMW Natural Interaction will initially be available on the automaker’s iNext, an electric SUV set to debut in the 2021 model year.

How Does it Work?

With BMW Natural Interaction, you can get your point across with a simple gaze or gesture. (BMW)

As its name suggests, BMW Natural Interaction is designed to facilitate communication that’s easy and organic. The driver can interact with the vehicle using voice, gestures and gaze – all at the same time or in various combinations.

The system can reliably understand voice commands, gestures and gaze direction, and it allows the driver to choose the preferred mode of operation. It can precisely detect hand and finger movements, as well as gesture direction and type.

Spoken instructions are processed using an intelligent learning algorithm called Natural Language Understanding. This algorithm is always being refined and updated, constantly processing complex information to generate accurate responses.

Since this technology offers a range of modalities, the driver can initiate vehicle functions in a host of different ways. For example, if the driver’s eyes are focused on the road, features could be initiated using speech or gestures. And if the driver is conversing with passengers, gesture and gaze control could be used to activate the desired amenity.
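To make the idea concrete, here is a minimal Python sketch of how a system might fuse whichever channels happen to be active into a single command. It is purely illustrative; the class, function names and targets are hypothetical and not drawn from BMW’s implementation.

```python
# Hypothetical sketch of multimodal command fusion (not BMW's actual code).
from dataclasses import dataclass
from typing import Optional

@dataclass
class MultimodalFrame:
    speech: Optional[str] = None          # spoken action, e.g. "open"
    gesture_target: Optional[str] = None  # object the finger points at
    gaze_target: Optional[str] = None     # object the eyes rest on

def resolve_command(frame: MultimodalFrame) -> str:
    """Combine whichever channels are active into one intent."""
    # Prefer an explicit pointing gesture, fall back to gaze,
    # and let speech supply the action.
    target = frame.gesture_target or frame.gaze_target
    action = frame.speech or "select"
    if target is None:
        return f"voice-only command: {action}"
    return f"{action} -> {target}"

# Eyes on the road, speaking while pointing:
print(resolve_command(MultimodalFrame(speech="open", gesture_target="sunroof")))
# Mid-conversation with a passenger, a glance alone picks the control:
print(resolve_command(MultimodalFrame(gaze_target="passenger window")))
```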

This system’s area of operation isn’t limited to the car’s interior. It allows the driver to interact with the exterior environment, including buildings and parking spaces. For example, you could get parking information by pointing at a lot and asking, “Can I park here and what does it cost?”
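The exterior example can be sketched in the same spirit: a pointing direction is matched against nearby points of interest, and the spoken question determines what to report back. The lot data and function names below are invented for illustration, not taken from BMW’s system.

```python
# Hypothetical sketch: pairing a pointing direction with a spoken parking query.
from typing import Optional

# Invented point-of-interest data, keyed by the bearing range each lot occupies.
PARKING_LOTS = {
    "lot_a": {"bearing_deg": (80.0, 100.0), "hourly_rate_eur": 2.50, "open": True},
}

def find_pointed_lot(bearing_deg: float) -> Optional[str]:
    """Return the lot whose bearing range contains the pointing direction."""
    for lot_id, info in PARKING_LOTS.items():
        low, high = info["bearing_deg"]
        if low <= bearing_deg <= high:
            return lot_id
    return None

def answer_parking_query(bearing_deg: float) -> str:
    """Answer 'Can I park here and what does it cost?' for the pointed-at lot."""
    lot_id = find_pointed_lot(bearing_deg)
    if lot_id is None:
        return "No parking area found in that direction."
    info = PARKING_LOTS[lot_id]
    status = "You can park there" if info["open"] else "That lot is closed"
    return f"{status}; it costs {info['hourly_rate_eur']:.2f} EUR per hour."

print(answer_parking_query(90.0))
```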

“Customers should be able to communicate with their intelligent connected vehicle in a totally natural way,” says Christoph Grote, Senior Vice President, BMW Group Electronics. “People shouldn’t have to think about which operating strategy to use to get what they want. They should always be able to decide freely – and the car should still understand them.”

Thinking Ahead

A gesture-control camera allows the car to track the driver’s hand and finger movements in three dimensions. (BMW)

This innovation owes its accuracy to exciting new sensor and analysis technologies. An infrared light signal allows the system’s gesture-control camera to capture the driver’s hand and finger movements in three dimensions. And a high-definition camera nestled in the instrument cluster is able to quickly register head movements and gaze direction.

BMW’s Grote says this feature is “an important step for the future of autonomous vehicles, when interior concepts will no longer be geared solely toward the driver’s position, and occupants will have more freedom.”

We say, bring it on. This technology takes us one step closer to a world where man and machine enjoy seamless interaction.


About the Author

  • Based in Los Angeles, Warren Clarke loves providing readers with the information they need to make smart automotive choices. He's provided content for outlets such as Carfax, Edmunds.com, Credit Karma and the New York Daily News.

He can be reached at wgcla@hotmail.com.