Navigation

Simply Talk to Your Car.

Live Like the Jetsons.

Voice recognition software has made huge leaps in accuracy over the last 6 years. In 2010, recognition accuracy was still around 70%. By the end of 2016, it should have just passed the 95% mark – at least in the view of Mary Meeker, a prominent Silicon Valley venture capitalist. Voice-activated digital assistants are one of the main pillars of the smart environment: connecting cars, houses and mobile devices and allowing them to communicate with each other.

The Voice as a Remote Control.

Every corner of CES offered a glimpse of the worry-free daily life that becomes possible when you control your environment with your voice. At practically every stand it became clear that voice-activated services will become the norm in the future. Your car knows the best route home. Your refrigerator knows the healthiest recipe for a good dinner with friends and family. And the easiest way to get your children to go to bed might be their Aristotle: a device that not only plays their favourite songs, but also lets the whole family doze off in a soft lighting ambience.

Everyday life in the future will probably do without mechanical buttons and input devices. We will control the artificially intelligent environment around us with voice and gestures alone.

1 Device, 7000 Skills.

The backbone of intelligent personal assistants is machine learning and highly developed artificial intelligence (AI). Together they make it possible to process the petabytes of data generated within a matter of seconds into useful information for user and provider alike. Increasingly fast computers and ever more efficient algorithms are constantly generating more consolidated data patterns, ensuring that (virtually) everything can be controlled by voice. “Machine learning forces us to develop software in a totally new way. Software is constructed through data nowadays,” explains Riegel Smiroldo, Head Engineer of the Machine Learning department at Mercedes-Benz in Sunnyvale. That Amazon’s Alexa has taken on a leading role among digital assistants can be attributed to the company’s willingness to make its technology available to external developers, even at a nascent stage.

In this context, Amazon has developed guidelines that facilitate the integration of Alexa into devices from external providers. At CES 2017, the majority of the exhibited products were already able to integrate Alexa, so it is no surprise that the number of Alexa Skills – apps that react to voice commands – has grown from 1,000 to 7,000 in the last 7 months alone.
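To give a feel for what such a Skill looks like under the hood, here is a minimal sketch of a voice-command handler. It follows the general request/response shape of the Alexa Skills Kit interface, but the intent name ("BedtimeIntent") and the spoken replies are made-up examples, not Amazon sample code.

```python
# Minimal sketch of an Alexa Skill handler (e.g. an AWS Lambda entry point).
# The intent name "BedtimeIntent" is hypothetical; the JSON envelope mirrors
# the general Alexa request/response structure.

def build_response(text, end_session=True):
    """Wrap plain text in the response envelope Alexa speaks back."""
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": text},
            "shouldEndSession": end_session,
        },
    }

def handler(event, context=None):
    """Dispatch on the type of the incoming voice request."""
    request = event["request"]
    if request["type"] == "LaunchRequest":
        # User opened the Skill without naming a specific action.
        return build_response("Welcome. What would you like to do?",
                              end_session=False)
    if request["type"] == "IntentRequest":
        # User spoke a command that was matched to a named intent.
        if request["intent"]["name"] == "BedtimeIntent":
            return build_response("Dimming the lights and starting your playlist.")
    return build_response("Sorry, I did not understand that.")
```

The key design point is that the Skill never processes raw audio: Amazon's cloud converts speech to a structured intent, and the developer only maps intent names to actions.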

Amazon’s Alexa is the current trailblazer among voice-activated digital assistants, but Microsoft, Google, Apple and Samsung have also launched their own solutions on the market. Ten million voice-activated digital assistants are expected to be sold this year alone. The research institute Gartner predicts that up to 30% of digital interactions will be voice-activated by 2020. By that time, the market for digital assistants is forecast to reach 3.6 billion US dollars.

Talking to Your Car as the Norm.

Alexa was originally used in Amazon Echo, a cylindrical speaker for the home. Its basic skills include creating to-do lists, playing Spotify playlists, securing the smart home, and controlling intelligent household appliances such as thermostats and light bulbs.

At CES, a number of different car manufacturers – notably Ford and VW – announced that they now intend to make the on-board computers in their vehicles accessible to Alexa. Nissan and BMW also presented their integrations of Cortana, Microsoft’s intelligent personal assistant. Daimler AG was one of the first OEMs to integrate Google Assistant, announcing its integration into Mercedes-Benz models before CES 2017. This enables destination data to be sent to the car via the intelligent speaker Google Home; from home, it can also be used to start the car’s air conditioning in advance or check the fuel level.

In the future, every Mercedes-Benz should be part of an intelligent ecosystem that can also be voice-controlled. In some markets, Mercedes-Benz customers will already be able to communicate with their vehicles using Google Home or Amazon Alexa from this year. Daimler AG places importance on absolute integrity here when it comes to protecting user data. Confidential, personal information collected by the artificial intelligence in the car stays in the car. It will not be stored on other devices or in the Cloud. Only general, non-personal data is uploaded to the Cloud and analysed, e.g. to enable car-to-car communication.

Lip-Reading Commands.

During the “Inspiration Talk: Connectivity”, NVIDIA CEO Jen-Hsun Huang spoke with Mercedes-Benz about the increasing merging of reality and virtuality, which is already starting to make people’s lives easier.

NVIDIA’s Co-Pilot was cited as an example of this. Co-Pilot collects data from sensors positioned inside and outside the car and uses it to build a profile of the car’s environment. On this basis, drivers can be informed about how to improve their driving behaviour. Thanks to artificial intelligence, Co-Pilot also reacts to voice commands such as “Take me home” and is even able to lip-read the driver’s commands. Could the future sound any better?