Measuring the world of the car

  • 26. February 2014
  • Autonomous Driving
  • Photos: Daimler
  • Text: Kai-Holger Eisele

Researchers at Daimler are working on the ability to determine the position of a vehicle with pinpoint accuracy.

Over the next decade, autonomous driving will gradually become a reality on our streets. Mercedes-Benz is step by step introducing new autonomous drive functions in its production models, enabling cars to take over more and more tasks from the driver and making the job of driving easier in situations such as traffic jams or freeway driving. The ultimate goal is a fully self-driving car offering the highest level of comfort and convenience for the occupants and maximum safety for all road users.


Until that point is reached, every effort is being made at Daimler to develop the wide range of technologies required to introduce fully reliable autonomous systems in everyday traffic situations. Sensor technologies – the hardware that accurately maps the immediate and wider environment of a self-driving car by way of camera, radar and ultrasound – are key, as are information technology and intelligent data processing. The challenges are complex: the vehicle’s environment not only has to be detected, but recognized, interpreted and correctly ‘understood’ by the system as part of a continuous situational analysis. To ensure that this analysis is absolutely reliable, the data from the various sensors are not analysed individually but pooled together (‘sensor fusion’). Using this analysis, the driverless vehicle is then in a position to plan future manoeuvres and carry them out within milliseconds using electronically controlled steering, brakes and drive system. This has to happen much more quickly than if a human were driving the car.
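The principle behind pooling sensor data can be illustrated with a minimal sketch: two independent position estimates (say, from camera and radar) are combined by inverse-variance weighting, so that the more reliable sensor contributes more to the fused result. The sensor names and variance values here are illustrative assumptions, not Daimler's actual figures.

```python
# Minimal sketch of 'sensor fusion' along one axis: combine independent
# position estimates by inverse-variance weighting. Values are illustrative.

def fuse(estimates):
    """estimates: list of (position_m, variance_m2); returns fused (pos, var)."""
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    position = sum(w * pos for (pos, _), w in zip(estimates, weights)) / total
    return position, 1.0 / total

camera = (12.4, 0.09)   # position estimate, 0.3 m standard deviation
radar = (12.1, 0.04)    # position estimate, 0.2 m standard deviation
pos, var = fuse([camera, radar])
```

Note that the fused variance is smaller than that of either sensor alone, which is the reason for fusing rather than picking the single best sensor.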



It is absolutely critical to be able to locate the position of a vehicle on a map with complete accuracy in order for the autonomous drive system to analyse a situation and plan a manoeuvre. In other words: the vehicle can only make the right decisions and determine the optimal path to a chosen destination if it is ‘aware’ of its own location in the world. The technical requirements for identifying a vehicle’s location are extremely complex because conventional satellite-based systems that determine the position of a vehicle, such as GPS navigation systems, do not provide sufficiently accurate data. “GPS provides location information with meter accuracy – but, depending on the situation, we need an accuracy of no more than twenty centimetres for highly autonomous drive functions,” explains Martin Haueis, Head of Localization and Data Management at Daimler’s Corporate Research and Pre-development unit.


Take the example of an intersection: on the one hand, the autonomous vehicle has to drive far enough onto the intersection so that its sensor systems can reliably detect approaching road users even when visibility is poor. On the other, it must not drive so far onto the intersection that other road users are obstructed or, in the worst case, put at risk. Needless to say, location accuracy with a margin of error of several meters is by no means sufficient. The degree of precision required for identifying a vehicle’s location also depends on the environment in which the autonomous vehicle is travelling, says Haueis. Greater accuracy is required for complex traffic situations, such as in urban areas where cars have to make turns and there are sidewalks, traffic lights and roundabouts, than on freeways. Whatever the scenario, the system has little time to determine its location: this has to happen within a tenth of a second (100 milliseconds).



In future, highly accurate digital maps, which will contain far more information than today’s navigation systems, will be an important tool for defining a vehicle’s environment at a given position. Different layers of the map will hold information on lane patterns and lane widths, and on the location of traffic signs, traffic lights and buildings, with an accuracy of up to ten centimetres. These innovative maps are created with the use of special measuring vehicles, similar to ones already used by map manufacturers today. When a vehicle is being autonomously driven, it identifies its own location on the digital map. Landmarks stored on the map are used to calculate the exact position of the vehicle using ‘correspondence finding’, which is accurate to within centimetres. Once this happens, the sensor fusion and situational analysis systems receive information from the digital map about the road ahead and about traffic lights, pedestrian crossings, stop signs and more.
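The idea behind correspondence finding can be sketched in a few lines: each landmark the car observes at a known relative offset is matched to its surveyed coordinate in the map, and each match yields one estimate of the vehicle position, which are then averaged. This is a deliberately simplified sketch with made-up coordinates; a real system would also estimate heading and reject mismatched landmarks.

```python
# Hedged sketch of 'correspondence finding': each observed landmark, seen at
# a relative offset from the car, is matched to its map coordinate. Each
# match implies vehicle_position = map_position - observed_offset; averaging
# the matches gives the position fix. All coordinates are illustrative.

def localize(matches):
    """matches: list of ((map_x, map_y), (rel_x, rel_y)) landmark pairs."""
    xs = [mx - rx for (mx, _), (rx, _) in matches]
    ys = [my - ry for (_, my), (_, ry) in matches]
    return sum(xs) / len(xs), sum(ys) / len(ys)

matches = [((105.0, 40.2), (5.1, 0.3)),    # e.g. a traffic sign
           ((112.0, 38.0), (12.0, -1.8)),  # e.g. a building corner
           ((98.5, 42.5), (-1.4, 2.6))]    # e.g. a curb edge
x, y = localize(matches)
```

Averaging over several landmarks is what pushes the accuracy below that of any single observation, which is how centimetre-level fixes become feasible.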


As the car travels, it therefore has access to a model of the environment that draws on both the information captured by the sensors and the information stored in the digital map. This combination also balances out some of the weaknesses inherent in today’s maps and sensors. The dynamic information not included in the pre-recorded digital map data is provided by the sensors, while the limited visual range of the sensors – they cannot ‘see’ around objects that obscure the view, for example – is offset by the map, which already knows the course of a road around a bend.



Daimler specialist Haueis is thinking far beyond the technical limitations of today, however. “Our vision is to involve vehicles in the updating of digital maps,” he says, “so that vehicles keep the map information up to date by constantly submitting map changes and reporting roadworks or new roundabouts to the relevant back-end system.” At the same time, every vehicle benefits from the information it receives in the network. Today, direct communication between vehicles (Car-to-Car Communication) is already possible with sensor data about traffic jams or accidents being sent to cars behind – irrespective of whether they are driven autonomously or by a person behind the wheel.


The type of data that must be contained in the individual layers of the map for an autonomous car will be driven by the needs of sensor fusion, situational analysis and vehicle localization. For the successful test drives of the autonomous S 500 INTELLIGENT DRIVE research vehicle, which took place in late summer 2013 on the historic Bertha Benz route between Mannheim and Pforzheim, developers from Mercedes-Benz opted for a camera-based localization approach which used characteristic landmarks along the route as reference points. The images from the cameras used in the research vehicle – a stereo camera facing forward and a mono camera facing back – were continuously compared with the three-dimensional plan of the surrounding area on the map. To increase the robustness of camera-based localization methods, MEMS (microelectromechanical systems) are used in the vehicle. These are microscopic accelerometers which measure the route covered by the car and its direction of travel starting from a geographic reference point. MEMS sensors are already used today in applications such as the deployment of a life-saving airbag and the ESP® Electronic Stability Program, which detects critical driving situations.
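Measuring the route covered and the direction of travel from a reference point is classic dead reckoning, which can be sketched as integrating distance and heading increments step by step. The start point, heading and step values below are invented for illustration; a real inertial system would integrate noisy accelerations at high rates and correct the accumulated drift against landmark fixes.

```python
import math

# Illustrative dead-reckoning sketch: integrate distance and heading-change
# increments from a known geographic reference point, as a MEMS-based
# odometry layer might do between camera landmark fixes. Values are made up.

def dead_reckon(start_xy, start_heading_deg, steps):
    """steps: list of (distance_m, heading_change_deg); returns final (x, y)."""
    x, y = start_xy
    heading = math.radians(start_heading_deg)
    for dist, dheading in steps:
        heading += math.radians(dheading)
        x += dist * math.cos(heading)
        y += dist * math.sin(heading)
    return x, y

# Drive 10 m straight ahead (east), then turn 90 degrees left and drive 5 m.
x, y = dead_reckon((0.0, 0.0), 0.0, [(10.0, 0.0), (5.0, 90.0)])
```

Because each step's error carries over into all later steps, dead reckoning drifts over time, which is exactly why it is paired with the camera-and-map localization described above rather than used on its own.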


Even the tiniest details in the vehicle’s environment can be identified by comparing camera images and map: road signs and markings, sidewalk edges, window ledges, edges of buildings and much more. This combination of features gives every conceivable position along the route its own characteristic ‘face’. According to Martin Haueis, regionalization of the digital maps will present a particular challenge for the future. For example, a U.S. highway may look completely different to a German autobahn – and the localization has to work with one hundred percent reliability wherever the car is in the world.


This type of feature-based localization does not have to be carried out by camera alone. The use of radar or environment modelling with laser scanners are possible alternatives, and the pros and cons of each approach are being investigated by Daimler’s research teams.


There is no doubt that the technological progress being made in the area of information and sensor technology will open up a whole new dimension in intelligent driver-assistance travel over the next ten years – all the way up to fully autonomous systems that can, if required, drive a car without any human intervention at all. Daimler research engineer Martin Haueis is already seeing things differently after working intensively on vehicle localization. In the development of the new digital maps for driverless vehicles, he says, “we are already beginning to see the world around us through new eyes.”



Calculating the ‘Ideal Line’


The computer-calculated path defines the exact boundaries of the space through which the driverless car can safely travel to a selected destination. The calculation is based on numerous parameters such as the current location, the lane width, topography, speed limits and the dimensions of the vehicle. Within this safe corridor lies the ideal trajectory of the vehicle – its precise path of travel in three-dimensional space over time, whereby potential obstacles must also be taken into account. How far in advance the path must be calculated depends to a large extent on the speed of the vehicle.
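How the planning horizon could grow with speed can be sketched with a simple model: the distance covered during a fixed planning latency plus the physical braking distance at a given deceleration. The latency and deceleration figures below are assumptions for illustration, not values from the research vehicle.

```python
# Rough sketch of a speed-dependent planning horizon: distance covered
# while (re)planning plus braking distance at an assumed deceleration.
# The 100 ms latency and 3 m/s^2 deceleration are illustrative assumptions.

def lookahead_m(speed_kmh, latency_s=0.1, decel_ms2=3.0):
    v = speed_kmh / 3.6                  # km/h -> m/s
    reaction = v * latency_s             # distance covered during planning
    braking = v * v / (2 * decel_ms2)    # stopping distance at given decel
    return reaction + braking

city = lookahead_m(50)      # typical urban speed
highway = lookahead_m(130)  # typical freeway speed
```

Since braking distance grows with the square of speed, the horizon at freeway speed is several times longer than in town, which matches the article's point that the required look-ahead depends strongly on vehicle speed.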



The eyes and ears of the car


The successful test runs involving the driverless S 500 INTELLIGENT DRIVE research car used only sensor technologies that are similar to those already being fitted in Mercedes-Benz production vehicles. Based on the Mercedes-Benz S-Class, the research car includes the two ‘eyes’ of the stereo camera next to the rear-view mirror; six radar sensors for mapping the areas in front of, behind and to the side of the vehicle, both in the near and middle distance; a total of twelve ultrasound sensors; and four other cameras to provide the 360° camera system.


While the stereo camera can ‘see’ three-dimensionally within a range of up to 50 meters, the radar sensors of the S-Class can identify objects in a radius of up to 200 meters around the car. For the test drive undertaken by the S 500 INTELLIGENT DRIVE, the range of the stereo camera was extended so that it could identify objects from an even greater distance. A colour camera, not fitted in the production model of the S-Class, was also added for traffic light recognition.
