
Dashboard-Voodoo.

The dashboard of the future is alive.

Horses and Riders.

The term dashboard describes a computer interface that collates all relevant information and presents it clearly. Originally, however, the word referred to a board at the front of a coach designed to protect the coachman from mud that was “dashed up.” The coachman steered his horses using spoken commands and reins, and the animals reacted and communicated with their master through body language. When engines finally replaced horses, this type of control had to be mechanically imitated: instead of reins and spoken commands, a steering wheel, gas pedal, and brake now controlled the vehicle. And when horsepower vanished under the hood, the dashboard was transformed from a mud barrier into a panel that let drivers see what the engine was doing.
This analog operating panel with its displays, buttons, and levers became the interface between the vehicle and its driver. This is a principle that still exists today. Only the name has changed; it is now known as a human-machine interface (HMI). However, with the advance of digitization, this interface is experiencing a transformation. The era when dashboards were merely lifeless instruments for displaying information is a thing of the past. With artificial intelligence under the hood, communication between the driver and vehicle is returning to its origins. Dashboards are becoming interfaces between people and machines that can finally be intuitively controlled: with voice commands, 3D gestures, and virtual feedback that feels real and alive.

Pixels instead of speedometer needles.

The dashboard of the future is a screen: screens as far as the eye can see. Whether physical buttons and switches will survive at all is questionable. Since smartphones conquered the world with their multi-touch surfaces, virtual controls have become the measure of all things. Displays not only render any content razor-sharp; they can also be operated by hand. It isn’t even necessary to touch the display: simple gestures are enough to operate assistance systems like Bixi, Chris, and Drivemode. These supposedly modern Tamagotchis are the intelligent descendants of satnavs. However, because they have to be stuck to the windshield or the center console and are not integrated components of the vehicle, they send out one message above all: this car is antiquated.

The next dimension in feeling.

Nowadays, 44 percent of the global population owns a smartphone. Swiping, zooming, scrolling, pinching, tapping, and clicking have long been part of the everyday vocabulary of the fingers. Smartphones started out as mini-computers with multi-touch displays; now manufacturers like Samsung are showing how the mini-computer can vanish completely beneath a curved display. Curved or foldable displays, richer colors, and sharper contrasts are a natural consequence of the continuous optimization of OLED display technologies. The vehicles of the future will naturally possess curved displays of this type. The ongoing development of electroactive polymers (EAPs) will also make them tangible: we will feel what we see. Back in 2013, Disney presented an algorithm that converts information from virtual surfaces into dynamic, tactile experiences. Apple already holds various patents for screens that provide haptic feedback, and the start-up Tanvas presented tactile screens at CES 2017. The idea of sitting in a car, ordering an item of clothing in an online shop, and simultaneously feeling on the display what the material is like is no longer an illusion.
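The principle behind such tactile rendering is compact: the perceived relief of a virtual surface can be approximated by modulating screen friction in proportion to how steeply that surface rises under the sliding finger. The Python sketch below illustrates this idea in deliberately simplified form; the function name, the linear mapping, and the thresholds are illustrative assumptions, not Disney’s or Tanvas’s actual implementation.

```python
import numpy as np

def friction_command(height_map, x, y, vx, vy, gain=1.0):
    """Simplified tactile rendering: derive a friction level from the
    slope of a virtual height map under the moving finger.

    height_map -- 2D array of virtual surface heights (0..1)
    x, y       -- finger position in pixel coordinates
    vx, vy     -- finger velocity (we feel the slope we slide against)
    gain       -- device-specific scaling (illustrative assumption)
    """
    # Discrete gradient of the virtual surface at the finger position.
    gy, gx = np.gradient(height_map)
    slope = np.array([gx[int(y), int(x)], gy[int(y), int(x)]])

    # Project the slope onto the direction of motion: sliding "uphill"
    # on the virtual surface should feel like extra drag.
    v = np.array([vx, vy])
    speed = np.linalg.norm(v)
    if speed < 1e-6:
        return 0.0  # no motion, no friction modulation needed
    uphill = max(0.0, float(slope @ (v / speed)))
    return min(1.0, gain * uphill)

# A virtual ridge in the middle of the screen, approached from the left:
surface = np.zeros((100, 100))
surface[:, 45:55] = 1.0
print(friction_command(surface, x=44, y=50, vx=1.0, vy=0.0))  # ~0.5
```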

The language of hands!

Apple launched the iPod in 2001, and its click wheel became the new benchmark for user interfaces: intuitive to use, and combining several controls in a single control unit. A variant of it can now be found in almost every car’s center console. Yet barely a decade later, this interface may already have had its day, as a wide range of developers have translated the principle behind it into a new dimension of gesture control, without an actual click wheel. In Harman’s Shape Shifting Console, a controller hidden under a seemingly ordinary leather-covered armrest reads hand movements and guides the user through the menu. The start-up Ultrahaptics is developing similar technology.
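None of these vendors disclose their recognition pipelines, but the underlying task is the same everywhere: turn a stream of tracked hand positions into discrete menu commands. Here is a minimal sketch, assuming a hypothetical hand tracker that delivers timestamped positions; all names and thresholds are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class HandSample:
    """One frame from a hypothetical hand tracker (positions in mm)."""
    x: float
    y: float
    t: float  # timestamp in seconds

def classify_swipe(samples, min_distance=60.0, max_duration=0.5):
    """Turn a short window of hand positions into a menu command.

    A movement counts as a swipe if the hand travelled at least
    `min_distance` mm within `max_duration` seconds; the dominant
    axis decides the direction. Thresholds are illustrative.
    """
    if len(samples) < 2:
        return None
    first, last = samples[0], samples[-1]
    if last.t - first.t > max_duration:
        return None
    dx, dy = last.x - first.x, last.y - first.y
    if max(abs(dx), abs(dy)) < min_distance:
        return None  # hand barely moved: not a deliberate gesture
    if abs(dx) > abs(dy):
        return "menu_next" if dx > 0 else "menu_previous"
    return "menu_select" if dy < 0 else "menu_back"

# A quick swipe to the right above the armrest controller:
frames = [HandSample(0, 0, 0.00), HandSample(40, 5, 0.15), HandSample(90, 8, 0.30)]
print(classify_swipe(frames))  # -> "menu_next"
```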

BMW is also looking for new ways to simplify menu control using so-called 3D gestures. Its AirTouch technology, introduced at CES 2016, lets users control a menu on the display through gestures without touching it. In 2017, the concept was extended with haptic holograms.

HoloActive Touch involves a display that floats freely in space and is operated through virtual touch, with a gentle tap of the fingertips. This currently happens 20 to 30 millimeters above a glass panel. In the future, not only will this be possible from a greater distance; users will also receive a haptic impulse when navigating through menus virtually with their fingers.
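Reduced to its essentials, such an interface has to solve two problems: detect when a fingertip crosses an invisible interaction plane, and answer that event with a mid-air haptic impulse (for example, a focused ultrasound pulse of the kind Ultrahaptics works on). The sketch below shows the detection half under stated assumptions; the plane height, press threshold, and function names are all hypothetical.

```python
# Hypothetical virtual-tap detection for a free-floating display.
PLANE_HEIGHT_MM = 25.0   # roughly the 20-30 mm range described above
TAP_DEPTH_MM = 5.0       # how far past the plane counts as a press

def detect_tap(prev_z, curr_z):
    """Return True when the fingertip pushes through the virtual plane.

    prev_z, curr_z -- fingertip height above the glass in mm, as
    reported by an assumed depth sensor on consecutive frames.
    """
    return prev_z >= PLANE_HEIGHT_MM and curr_z <= PLANE_HEIGHT_MM - TAP_DEPTH_MM

def on_frame(prev_z, curr_z, emit_pulse):
    # Answer a detected tap with a haptic impulse so the virtual
    # button "pushes back" against the fingertip.
    if detect_tap(prev_z, curr_z):
        emit_pulse()
        return "button_pressed"
    return None

print(on_frame(26.0, 18.0, emit_pulse=lambda: print("haptic pulse")))
```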

The technologies behind such interfaces frequently originate not in the vehicle industry but in the wonderful world of computer games. Start-ups like Leap Motion, Gestigon, Heptasense, and eyeSight are experimenting in their own unique ways with 3D gesture technology.

The 360° cockpit.

The dashboard used to end where the windshield began, but this boundary is increasingly disappearing. Head-up displays (HUDs) already project information into the driver’s field of vision in front of the windshield. Since 2013, Konica Minolta has been developing 3D AR HUDs that can display such projections three-dimensionally for the first time. Automotive supplier Continental, meanwhile, is aiming for market-ready AR HUDs by 2019.
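Whatever the optics, every AR HUD must solve the same geometric problem: a point out on the road has to be mapped to the spot on the virtual image plane where the driver’s eye will see it overlap the real world. Below is a minimal pinhole-projection sketch of that mapping; the coordinate convention and all parameters are assumptions for illustration, not any supplier’s actual model.

```python
import numpy as np

def project_to_hud(point_world, eye, image_distance_m, px_per_m):
    """Project a road point onto a HUD's virtual image plane.

    Simple pinhole model: the virtual image plane is assumed to sit
    `image_distance_m` ahead of the driver's eye, perpendicular to
    the viewing axis. Axes: x = forward, y = left, z = up (metres).

    Returns (u, v) pixel offsets from the plane centre (u right,
    v down), or None if the point lies behind the driver.
    """
    p = np.asarray(point_world, dtype=float) - np.asarray(eye, dtype=float)
    if p[0] <= 0:
        return None  # behind the eye: nothing to draw
    # Similar triangles: scale lateral/vertical offsets onto the plane.
    scale = image_distance_m / p[0]
    u = -p[1] * scale * px_per_m  # world-left maps to screen-left (negated)
    v = -p[2] * scale * px_per_m  # world-up maps to screen-up (negated)
    return float(u), float(v)

# Highlight a vehicle 40 m ahead and 1.5 m to the left, on a virtual
# plane 2.5 m in front of an eye point 1.2 m above the road:
print(project_to_hud((40.0, 1.5, 0.0), eye=(0.0, 0.0, 1.2),
                     image_distance_m=2.5, px_per_m=400))
```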

In view of this technology, the question arises as to when the first smart windshields will appear. Jaguar presented a concept for this back in 2014 and has since been working on the abolition of the A-pillar. Smart windshields are not only set to make vehicles safer; they will also provide a new channel for advertisers: smart-glass manufacturer Corning Inc. wants to transform windshields into advertising panels. Once self-driving cars have established themselves, the dashboard will no longer sit only in front of the driver but all around them, and it will serve not only to control the vehicle but also as a place for work, entertainment, and relaxation. A study by Panasonic shows how naturally such moving displays can work: content is virtually swiped through space from one surface to the next. This kind of content agility is somewhat reminiscent of scenes from the film “Minority Report.”

Alexa, move the vehicle forward!

Apple bought Siri Inc. in 2010 and went on to equip the iPhone 4S with the voice assistant. Apple’s servers now process around 2 billion Siri requests per week. At the latest since Amazon’s voice assistant Alexa found its way into cars, it has been clear that voice control will be a key feature of the dashboards of tomorrow. Google wants to offer such services as well: the intelligent Google Assistant will be available in cars, and Audi and Volvo will equip future models with Android as the operating system. According to Ben Evans, an analyst at Andreessen Horowitz, one day we will simply climb into the car and say, “I want to go home!” The vehicle will do the rest.
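Whichever assistant wins, a command like “I want to go home!” passes through the same basic pipeline: speech is transcribed, an intent is extracted, missing details are filled in from the user’s profile, and a vehicle function is invoked. The toy sketch below mimics that flow with simple phrase matching; real assistants use large speech and language models, and every name and profile field here is an invented placeholder.

```python
# Toy intent pipeline for an in-car voice assistant (all names invented).
USER_PROFILE = {"home": "1 Example Street", "work": "2 Sample Road"}

INTENTS = {
    "go home": ("navigate", "home"),
    "take me home": ("navigate", "home"),
    "drive to work": ("navigate", "work"),
}

def handle_utterance(text):
    """Map a transcribed utterance to a vehicle action."""
    normalized = text.lower().strip(" !.")
    for phrase, (intent, slot) in INTENTS.items():
        if phrase in normalized and intent == "navigate":
            # Slot filling: resolve "home" via the driver's profile.
            return f"Starting navigation to {USER_PROFILE[slot]}"
    return "Sorry, I did not understand that."

print(handle_utterance("I want to go home!"))
```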

The return to the principle of horse and rider.

The dashboard of the future is a holistic digital user interface that can be used completely intuitively. This is because the vehicle recognizes its driver and provides them with an individualized interface. Manufacturers such as Faraday, Panasonic, and Airbus have already shown that using the driver’s identity as the key to the vehicle will become completely normal. It is a little like the horse smelling its coachman long before his arrival and welcoming him with pleasure when he appears. Vehicles equipped with artificial intelligence will not only be able to speak like people but will also react to the emotions of their driver. Start-ups like Affectiva have long enabled machines to read people’s emotions so they can communicate with their users on a more human level.
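The architecture this implies is simple to sketch: an identity signal (face, voice, or paired phone) unlocks a stored profile, which in turn configures the cabin and the interface. The following minimal sketch assumes a recognition result is already available; every identifier, preference field, and message is hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class DriverProfile:
    """Personalization state loaded once the driver is recognized."""
    name: str
    seat_preset: int        # hypothetical memory-seat position
    dashboard_layout: str   # e.g. "minimal" or "full"
    destinations: dict = field(default_factory=dict)

PROFILES = {
    "driver_001": DriverProfile("Alex", seat_preset=3,
                                dashboard_layout="minimal",
                                destinations={"home": "1 Example Street"}),
}

def on_driver_recognized(driver_id):
    """Identity as the key: unlock and personalize, or stay locked."""
    profile = PROFILES.get(driver_id)
    if profile is None:
        return "Vehicle remains locked: person not recognized."
    # Apply the stored preferences to the (hypothetical) cabin systems.
    return (f"Welcome back, {profile.name}. Seat preset {profile.seat_preset}, "
            f"{profile.dashboard_layout} dashboard loaded.")

print(on_driver_recognized("driver_001"))
print(on_driver_recognized("driver_999"))
```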

Authors: David Menzel and Jean-Paul Olivier