
Successful autonomous driving – A pilot project by Daimler and Bosch.

In 2019, Daimler and Bosch will test fully automated technology in city traffic in a pilot project. The car will perceive its surroundings and look for a safe route.

Safety is the top priority.

People who drive conventional vehicles in heavy urban traffic actually perform an extraordinary feat: they continually monitor their surroundings, make decisions in fractions of a second, operate the vehicle, and focus on the route that needs to be taken. Autonomous vehicles need to be able to do the same things. The question is whether the technology will be able to perform as well as — or even better than — humans. Michael Hafner, Head of Automated Driving at Daimler, offers a clear-cut answer: “We are moving toward the objective of automated driving systematically, and also more quickly than many people believe.” The most important thing, however, is to “introduce a safe and reliable system that can be mass-produced. Indeed, safety is the top priority and the common thread for all aspects and development stages as we move toward series production. Whenever there are doubts, our policy is that being thorough is more important than being quick.”


With the Intelligent World Drive, Mercedes-Benz is testing automated driving functions on five continents using a test vehicle based on the S-Class.
TecDay #urban automated driving Immendingen 2018: Cameras allow the vehicle to read traffic lights and road signs.

Cities as a testing environment.

Fully automated driving will become a reality when Daimler and Bosch launch a pilot project in Silicon Valley in California in the second half of 2019. The two companies are thus accelerating the development of fully automated driving and driverless vehicles. The goal here is to introduce automated systems in production vehicles by the beginning of the next decade.

The use of cities as a testing environment represents an extremely complex undertaking for the two development partners, as cars, commercial vehicles, pedestrians, cyclists, skateboarders, and even pets travel around urban areas at close quarters and in confusing situations. Some of these traffic participants will never be equipped with the type of technology that enables direct networking with other participants — and development activities are focused on exactly this aspect. More specifically, systems for fully automated driving will not only have to manoeuvre vehicles correctly; they will also have to use their sensors in order to monitor the complete surrounding environment and take into account the interests of every vehicle or individual that crosses their path. Once all of these things can be done, streets and roads will be safer for everyone. “We continue to pursue our vision of accident-free driving,” Hafner explains, “and this ambitious goal can only be achieved through many small steps that lead to fully automated vehicles.”


High technical requirements.

So, how does fully automated driving actually work? Put simply, an automated vehicle knows its starting point and destination and can also make use of detailed high-resolution maps. It is also equipped with sensors that can detect objects and people in the surrounding area, as well as technical systems that enable it to steer, accelerate, and brake without any action on the part of a human driver. An onboard computer calculates the best route. Once under way, the vehicle monitors its surroundings like a human driver and independently determines which actions need to be taken. This means it can respond flexibly – for example, it can swerve around a ball that rolls onto the street or brake in time to avoid a collision.
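
The cycle described above can be pictured as a simple sense-decide-act loop. The sketch below is a heavily simplified, purely illustrative version in Python; the class names, the braking-distance threshold, and the decision rule are our own assumptions, not the Daimler and Bosch implementation.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Detection:
    kind: str          # e.g. "pedestrian", "ball", "vehicle"
    distance_m: float  # distance along the planned route

def choose_action(detections, braking_distance_m=15.0):
    """One decision step: keep following the route unless something is too close."""
    for obj in detections:
        if obj.distance_m < braking_distance_m:
            # A ball rolling onto the street, for example, triggers braking.
            return "brake"
    return "follow_route"

# One pass through the loop: sense -> decide -> (acting would apply the command).
sensed = [Detection("ball", 8.0), Detection("vehicle", 60.0)]
print(choose_action(sensed))  # -> brake
```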

It all sounds simple, but the technical requirements are extremely challenging. “A modern office computer is a very powerful and compact computing unit,” says Theresa Kienle, who works on autonomous driving systems development at Bosch. “Taken together, all the devices in a fully automated vehicle have a computing capacity that’s six times higher than that of an office computer, so such a vehicle can perform trillions of operations every second.” Daimler and Bosch are working on hardware systems for automated driving with Nvidia, a North American company that specialises in image processing and artificial intelligence.



Reliable environment recognition as the basis.

In order to develop fully automated driverless vehicles that can operate in urban environments, you need systems equipped with sensors that can reliably detect and recognise objects. Here, cameras, radar, ultrasound, and lidar serve as a vehicle’s “sensory organs.” Each type of sensor has a specific talent, so to speak: cameras can recognise colours, and using two of them enables stereoscopic vision. Radar works rapidly, can even “see” underneath cars, and has a range of up to 250 metres. Ultrasound, on the other hand, is very good for monitoring short distances around the vehicle. Lidar (Light Detection and Ranging) uses a laser beam to create a highly precise 3D measurement of the distance to a recognised object, its position, and its height.
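
As a small illustration of the lidar measurement mentioned above, the sketch below converts a single lidar return (measured range plus beam angles) into a 3D position. This is standard spherical-to-Cartesian geometry; the function name, the coordinate convention, and the example values are assumptions made for illustration only.

```python
import math

def lidar_return_to_xyz(range_m, azimuth_deg, elevation_deg):
    """Convert one lidar return into a 3D point relative to the sensor."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)  # forward
    y = range_m * math.cos(el) * math.sin(az)  # to the left
    z = range_m * math.sin(el)                 # height relative to the sensor
    return x, y, z

# A return 25 m away, 10 degrees to the left, slightly below sensor height.
print(lidar_return_to_xyz(25.0, 10.0, -2.0))
```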


All sensors continually monitor the vehicle’s entire surroundings in real time. The data they produce is put through a sensor fusion process that calculates a highly accurate model of the surrounding environment within milliseconds and precisely plans the vehicle’s route. Daimler and Bosch are jointly developing the algorithms needed for this. Such a system produces a huge amount of data. A stereo video camera alone generates around 100 gigabytes of data for every kilometre driven. The model of the surrounding environment that’s produced meets the stringent safety standards at Daimler and Bosch, which in the opinion of both companies is a basic requirement if fully automated vehicles are to become a reality. In order to ensure maximum reliability, various circuits perform the required calculations in parallel.
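
The paragraph above combines two ideas: fusing the views of several sensor types into one environment model, and computing safety-relevant results redundantly. The toy sketch below illustrates only the first idea with a simple agreement rule; it is our own simplified example, not the fusion algorithm Daimler and Bosch are developing.

```python
def fuse(camera_objects, lidar_objects, radar_objects):
    """Keep only objects confirmed by at least two independent sensor types."""
    votes = {}
    for objects in (camera_objects, lidar_objects, radar_objects):
        for obj in objects:
            votes[obj] = votes.get(obj, 0) + 1
    return {obj for obj, count in votes.items() if count >= 2}

environment_model = fuse(
    camera_objects={"pedestrian", "cyclist"},
    lidar_objects={"pedestrian", "car"},
    radar_objects={"car"},
)
print(environment_model)  # -> {'pedestrian', 'car'} (set order may vary)
```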


TecDay #urban automated driving Immendingen 2018: Cameras are the eyes of the car and allow the vehicle to read traffic lights and road signs, to differentiate road from walkway and to recognise all road users safely.

Machine learning.

The development work involves neural networks, artificial intelligence, and machine learning – in particular, deep learning. Developers feed the neural networks a huge number of different traffic situations. “In order to correctly assess traffic situations, the computer needs to have already seen many different situations and be able to correctly identify individual aspects of a given situation,” says Uwe Franke, Head of Image Understanding at Daimler. “Our engineers stipulate the curriculum here, so to speak, since the system doesn’t decide for itself that it should take a look at what’s beyond the next hill, for example.” With this approach, the systems learn which conclusions need to be drawn in each situation – exactly the way people do.
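
As a purely illustrative example of learning from a “curriculum” of labelled situations, the sketch below trains a tiny linear classifier to separate pedestrians from vehicles using two made-up features. The features, labels, and learning rate are all our own assumptions; the production systems use deep neural networks trained on vast amounts of real sensor data, not anything this simple.

```python
# Toy "curriculum": (speed in m/s, length in m) -> 1 = pedestrian, 0 = vehicle.
examples = [
    ((1.4, 0.5), 1),
    ((1.1, 0.4), 1),
    ((13.0, 4.5), 0),
    ((8.0, 4.2), 0),
]

w = [0.0, 0.0]  # weights, adjusted whenever the model misclassifies an example
b = 0.0         # bias term

for _ in range(100):  # repeated passes over the curriculum
    for (speed, length), label in examples:
        prediction = 1 if w[0] * speed + w[1] * length + b > 0 else 0
        error = label - prediction
        w[0] += 0.1 * error * speed
        w[1] += 0.1 * error * length
        b += 0.1 * error

# After training, the known situations are classified correctly.
print(1 if w[0] * 1.4 + w[1] * 0.5 + b > 0 else 0)  # -> 1 (pedestrian)
```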

In one test, for example, the display on a developer’s computer shows a traffic situation that’s been detected and recognised by a lidar system during driving. Here, there are two pedestrians on a pavement, both of whom are displayed in red. A cyclist is also shown in red, while his bicycle is dark red. Cars are blue, trucks and other commercial vehicles are dark blue, and street lights are depicted in grey. All the other sensors also monitor the scene simultaneously and provide data.
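
For reference, the colour coding of that developer display can be written out as a simple lookup table. The mapping below follows the colours named in the text; the structure, the class identifiers, and the fallback colour are our own illustrative choices.

```python
# Class-to-colour mapping as described for the developer display.
CLASS_COLOURS = {
    "pedestrian": "red",
    "cyclist": "red",
    "bicycle": "dark red",
    "car": "blue",
    "truck": "dark blue",
    "commercial_vehicle": "dark blue",
    "street_light": "grey",
}

def display_colour(detected_class):
    # The fallback colour for classes not listed above is an assumption.
    return CLASS_COLOURS.get(detected_class, "white")

print(display_colour("cyclist"))  # -> red
```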


Environment detection.

“Precise classification and the recognition of details represent a major breakthrough in this field,” says Markus Enzweiler, who’s responsible for lidar systems at Daimler. “A fully automated vehicle doesn’t just see objects; it also knows, to a certain extent, whether that object is a person or a vehicle, and even what type of vehicle, and it has knowledge of the characteristics of the objects it sees.”

The onboard computers can also forecast an object’s direction of motion. This type of information is essential, especially when autonomous vehicles drive on congested city streets. Is a pedestrian walking along the pavement parallel to the road? Will the pedestrian step onto the road in order to pass another person with a stroller on the pavement? Or does the pedestrian want to cross the street at a crosswalk ahead? The systems can also detect and recognise individuals who are hidden behind a parked car, for example, and then step out from behind it. Gestures also need to be taken into account – for example, when a pedestrian about to cross a street waves at a vehicle to indicate that it’s okay to pass before they cross.
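
A minimal way to picture motion forecasting is to extrapolate an object’s recent movement and check whether the predicted path leaves the pavement. The sketch below does exactly that with a constant-velocity assumption; the coordinate convention, the time horizon, and the function names are hypothetical, and real systems use far richer, learned prediction models.

```python
def predict_position(position, velocity, seconds_ahead):
    """Constant-velocity extrapolation of a tracked object's position."""
    x, y = position
    vx, vy = velocity
    return (x + vx * seconds_ahead, y + vy * seconds_ahead)

def will_enter_road(position, velocity, road_edge_y=0.0, horizon_s=2.0):
    # Convention for this toy example: y > road_edge_y is the pavement,
    # y <= road_edge_y is the road surface.
    _, future_y = predict_position(position, velocity, horizon_s)
    return future_y <= road_edge_y

# A pedestrian on the pavement (y = 1.5 m) drifting toward the road at 1 m/s.
print(will_enter_road(position=(10.0, 1.5), velocity=(1.2, -1.0)))  # -> True
```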



Different traffic situations.

Autonomous driving systems have achieved major successes of late. “Recognition rates are now excellent in very different kinds of traffic situations,” says Markus Braun, an IT specialist who has been working on the development of autonomous driving systems at Daimler for three years now. The systems “have become more robust, as we put it, and the goal now is to reduce recognition errors to an absolute minimum.” In order to give the computers the broadest possible knowledge base and take into account regional conditions and characteristics, the neural networks have also been fed data from 12 European countries, says Braun: “After all, drivers, pedestrians, and cyclists in Rome act very differently than their counterparts in Berlin.”


All critical actuators and motion sensors must have backups.

All data is sent to an automated drive controller (ADC), the central computer in a fully automated driverless vehicle. The ADC makes decisions that it passes on to the motion control unit, which then operates the actuators for steering, acceleration, and braking. Daimler has clear rules for this: all critical actuators and motion sensors in a fully automated Mercedes-Benz must have backups, and this also includes all control devices and the power supply system.
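
The chain described above – decisions from the ADC, actuator commands from the motion control unit, and redundant safety-critical actuators – can be sketched in a few lines of code. Everything below is an illustrative assumption on our part (class names, the fail-over rule, the command format); it is not the Daimler or Bosch software architecture.

```python
class Actuator:
    def __init__(self, name, healthy=True):
        self.name, self.healthy = name, healthy

    def apply(self, value):
        if not self.healthy:
            raise RuntimeError(f"{self.name} failed")
        print(f"{self.name} <- {value}")


class RedundantActuator:
    """Two physical units behind one logical actuator, e.g. for steering."""
    def __init__(self, primary, backup):
        self.primary, self.backup = primary, backup

    def apply(self, value):
        try:
            self.primary.apply(value)
        except RuntimeError:
            self.backup.apply(value)  # fall back to the redundant unit


class MotionControlUnit:
    def __init__(self, steering, throttle, brakes):
        self.steering, self.throttle, self.brakes = steering, throttle, brakes

    def execute(self, decision):
        self.steering.apply(decision["steering_angle_deg"])
        self.throttle.apply(decision["throttle"])
        self.brakes.apply(decision["brake"])


# The ADC's decision (here just a dict) is passed on to the motion control unit.
mcu = MotionControlUnit(
    steering=RedundantActuator(Actuator("steering motor 1", healthy=False),
                               Actuator("steering motor 2")),
    throttle=Actuator("throttle"),
    brakes=Actuator("brakes"),
)
mcu.execute({"steering_angle_deg": 3.5, "throttle": 0.1, "brake": 0.0})
```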


For example, the electrically operated steering system not only has two motors but also redundant power electronics. The pneumatic brake booster previously used has been replaced by the electromechanical iBooster from Bosch. The combination of the brake booster and the Electronic Stability Programme (ESP) creates a braking system that allows the vehicle to be stopped safely even if individual components fail.



Safe on the road.

A network of software modules spread out across several control devices manages the interaction of the actuators. This also includes recognising and compensating for disturbances such as crosswinds, bumps in the road, and suddenly slippery road surfaces. This software module network can also brake the vehicle safely and bring it to a stop if the commands from the ADC are interrupted or if specific driving instructions are deemed implausible. The system also continually calculates a braking path that network components still operating properly can use if necessary.
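
The fallback behaviour described above can be illustrated with a simple supervision routine: if commands stop arriving or look implausible, the vehicle switches to a safe stop along the most recently calculated braking path. The timeout, the plausibility limit, and all names below are assumptions made for this sketch only.

```python
import time

MAX_COMMAND_SILENCE_S = 0.2    # assumed watchdog timeout
MAX_STEERING_ANGLE_DEG = 45.0  # assumed plausibility limit

def plausible(command):
    """Reject driving instructions that fall outside assumed physical limits."""
    return abs(command.get("steering_angle_deg", 0.0)) <= MAX_STEERING_ANGLE_DEG

def supervise(command, last_command_time, braking_path):
    """Decide between executing the ADC command and falling back to a safe stop."""
    silent_too_long = time.monotonic() - last_command_time > MAX_COMMAND_SILENCE_S
    if command is None or silent_too_long or not plausible(command):
        return {"action": "safe_stop", "path": braking_path}
    return {"action": "execute", "command": command}

# A command that arrived in time and looks plausible is executed as-is.
now = time.monotonic()
print(supervise({"steering_angle_deg": 3.0}, last_command_time=now, braking_path=[]))
```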

All of this is extremely complex, but Daimler and Bosch both agree that anything less would be unacceptable. “The development of automated driving systems to the series production stage is like a decathlon,” says Stephan Hönle, Head of the Automated Driving unit at Bosch. “It’s not enough to be good in just one or two areas. Instead, you have to master all the disciplines, like we do. Only then will it be possible to ensure safe automated driving on the road and in the city.”