Automotive E-ssentials

Your regular update for technical and industry information

Homologation and validation of automated driving functions

Fewer accidents, increased efficiency, better environmental compatibility: automated driving functions (AF) are among the most important innovations in the history of mobility. But automated driving functions must comply with strict regulatory requirements to ensure safety and functionality. These are laid down, inter alia, in UNECE R79 of the United Nations Economic Commission for Europe. Together with AVL, we use this regulation as an example to show how relevant concrete scenarios for the validation and homologation of vehicles with automated driving functions are determined, and how a validation framework and corresponding tools can best support homologation.

Homologation and Validation of Autonomous Vehicles: Proof of Safety

The rapid progress in automated driving necessitates the continuous improvement of algorithms. One of the biggest challenges is the proof of "safety". Higher levels of automation according to SAE J3016 (2016) require an increasing number of tests for approval or certification. The biggest leap for homologation takes place in the transition from assisted (SAE Level 2) to automated driving (SAE Level 3), as responsibility for the driving task is transferred to the system. In this context, the safety and functionality of the AF for the defined use case must be guaranteed and confirmed by an independent body. Countless different driving situations have to be considered, which makes the test scope for the approval of such systems more complex [1]. This article illustrates the topic using UNECE R79 [2] as an example. TÜV SÜD is working on the homologation methodology, with a focus on feasible and efficient approaches. For these approaches, AVL is developing a consistent, open validation framework and the corresponding tools to support homologation in the best possible way.


Homologation describes the process of testing, approving and certifying products produced in series to obtain approval for a specific market. Depending on the country, the manufacturer itself (self-certification) or an independent designated technical service such as TÜV SÜD (3rd party system) can provide proof of compliance with regulatory requirements to the type approval authority. The latter system is established in Europe [3].

Regulations (e.g. UNECE R139) describe driving manoeuvres for the homologation of functions relevant to vehicle dynamics, such as brake assist systems (BAS), which are intended to prove the suitability of the systems. Other methods are required for the safety verification of AF. Scenarios are the central element for proving correct decisions in interaction with the AF's environment. A much-discussed point is how to handle the immense number of possible traffic situations while keeping homologation practicable [3].

Generic Method and Scenario Selection

A promising approach to address the challenge of the number of situations is scenario-based testing. Assuming that the majority of situations are non-critical, scenario-based testing is limited to relevant events (scenarios). A framework for this approach was developed e.g. within the German funding project PEGASUS [4].

Validation and homologation already use scenario-based tests. In homologation, the scenarios are defined together with KPIs and criteria from the relevant regulations, e.g. UNECE R79. The most challenging aspect of scenario execution is the uncertainty of the scenario parameters. These are usually not specified in detail and must be defined for the independent evaluation of the system in consultation with the technical service.

The main task of validation is to select all relevant scenarios and corresponding parameters. A promising approach is defined in [5]. Figure 1 shows an overview of the procedure.

Overview of the scenario-based process

First, the approach uses logical scenarios from the regulations. The goal is to restrict the parameters of the logical scenarios in order to identify the most relevant tests. Based on the three elements shown in Figure 1, the parameters of the logical scenarios are optimized. The first two elements focus on the identified weaknesses of the system under test (SUT) and enable system-specific scenario definitions. To increase the difficulty, the system-independent complexity is raised by adding further road users. The elements must be considered in parallel during the optimization.
The results are relevant concrete scenarios for the validation and homologation of vehicles with AF. The unique aspect of the approach is the system-specific adaptation of the relevant scenarios to the weak points of the system under consideration.
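The search for relevant concrete scenarios can be sketched as a parameter search over a logical scenario. The criticality metric, parameter ranges, and grid search below are purely illustrative assumptions, not the optimization method of [5]:

```python
import itertools

# Hypothetical criticality score for a lane-keeping scenario: the lateral
# acceleration demanded by a curve at a given speed (v^2 / r). A real
# implementation would also account for SUT weaknesses and added traffic,
# as described for the three elements in Figure 1.
def criticality(speed_kmh, curve_radius_m):
    v = speed_kmh / 3.6  # km/h -> m/s
    return v * v / curve_radius_m

# Logical scenario: parameter ranges (speeds loosely follow the R79 speed
# bands; the curve radii are assumed plausible highway values).
speeds = [10, 60, 100, 130]   # km/h
radii = [250, 500, 1000]      # m

# Concrete scenario = the parameter combination maximizing criticality.
worst = max(itertools.product(speeds, radii), key=lambda p: criticality(*p))
print(worst)  # -> (130, 250)
```

In practice the exhaustive grid is replaced by the optimization described above, which adapts the scenarios to the weaknesses of the specific SUT.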

The efficient execution of the scenarios presents a further challenge. Especially in the validation of AF, the number of scenarios and parameters is tremendous, so that virtual methods will increasingly be used.

Validation of models and tool chain

If virtual methods are used, they must first be validated [6]. This means demonstrating that the models and the tool chain used correlate sufficiently well with reality.

Different virtual methods are shown in Figure 2.

Use of virtual methods for homologation and validation

Highly simplified simulation models are not sufficient for system validation or homologation. On the one hand, the models must map the system behavior in detail; on the other hand, it must be ensured that different models and tools are linked in a numerically correct way.
Another challenge is the certification of the tool chain [3]. The complexity and requirements of a system simulation can be demonstrated using a Highway Pilot for commercial vehicles as an example. For validation and approval at system level, various subsystems must be available at a sufficiently high level of detail, e.g.:

  • A steering model for lane keeping and following the planned trajectory
  • A brake model for emergency braking
  • Various vehicle dynamics functions used by the AF (e.g. an ABS controller)
  • Sensor models
  • etc.
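The need for numerically correct coupling can be illustrated with a toy fixed-step co-simulation of two such subsystem models exchanging signals once per macro step. All names, gains, and the step size below are illustrative assumptions, not part of any actual tool chain:

```python
# Toy fixed-step co-simulation: a first-order steering actuator coupled to
# a crude lateral-dynamics model. Each macro step, the two submodels
# exchange their coupling signals (wheel angle, lateral velocity).
DT = 0.01  # shared macro step size in seconds (an assumption)

def steering_step(target_angle, angle, rate=5.0):
    """First-order steering actuator moving toward the commanded angle."""
    return angle + rate * (target_angle - angle) * DT

def lateral_step(lat_vel, wheel_angle, gain=3.0, damping=1.0):
    """Crude lateral-velocity response to the road-wheel angle."""
    return lat_vel + (gain * wheel_angle - damping * lat_vel) * DT

angle, lat_vel = 0.0, 0.0
for _ in range(1000):  # simulate 10 s, one signal exchange per step
    angle = steering_step(0.1, angle)
    lat_vel = lateral_step(lat_vel, angle)

print(round(angle, 3), round(lat_vel, 3))  # both settle near steady state
```

With an ill-chosen step size or exchange order, such a coupled loop can drift or oscillate even when each submodel is individually correct, which is exactly why the numerically correct linking of models and tools must be verified.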

The validation procedure is based on the upcoming framework Regulation (EU) 2018/858.

Compared to the virtual homologation of the Electronic Stability Program according to UNECE R140, the process for AF is much more complex. There are more relevant sub-systems and a more comprehensive Operational Design Domain (ODD). In addition to the actual validation tasks, virtual system integration plays an important role in building the tool chain. A separation between model and model parameterization is mandatory. Before using the environment for AF validation and homologation, an integration and validation process must be completed. Parameter and model data management is required for traceable documentation of the virtual tests.

Integration and validation of the system simulation tool chain

Figure 3 illustrates the procedure in detail. The ODD and the associated scenarios are just as important when defining a simulation environment as they are when developing the AF. The level of detail of the simulation models is derived from the ODD. For defining levels of detail for simulation models, the ISO 11010-1 standard, which is currently under development, can be used.

As in the vehicle development process, the extended use of simulation at system level requires integration competencies for the virtual vehicle. A system simulation is therefore characterized not only by valid subsystem models, but also by validity at the overall system level. In addition, aspects such as numerically correct co-simulation between different tools and models as well as the integration of real and virtual components play an important role [7].

The validation of the models and the tool chain at system level is therefore a core task of the system simulation.

Special driving scenarios are defined for the comparison between reality and simulation. These differ from the actual scenarios used to validate the AF. Basically, a distinction is made between validation of the passive vehicle (AF switched off) and the active vehicle (AF switched on).

Different methods are used to compare reality and simulation: defined tolerance bands for time-series data (e.g. [8]), comparison of state changes, or KPIs. Statistical analyses such as regression analyses are also used.
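A minimal sketch of such a tolerance-band comparison, assuming a combined absolute/relative band; the actual band construction is standard-specific (cf. [8]):

```python
# Minimal tolerance-band comparison between a measured and a simulated
# signal. The combined absolute/relative band is an assumption; standards
# such as ISO 19365 define their own band construction.
def within_tolerance(real, sim, abs_tol, rel_tol):
    """True if every simulated sample lies within the band around the
    measured sample; the band is the wider of abs_tol and rel_tol*|real|."""
    return all(abs(r - s) <= max(abs_tol, rel_tol * abs(r))
               for r, s in zip(real, sim))

# Illustrative lateral-acceleration traces in m/s^2.
measured  = [0.0, 0.8, 1.6, 2.1, 2.0]
simulated = [0.0, 0.7, 1.7, 2.2, 1.9]
print(within_tolerance(measured, simulated, abs_tol=0.15, rel_tol=0.05))
```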

Example of Method Application

In this section, the method presented is applied to UNECE R79. This regulation covers, among other things, safety requirements for steering equipment and steering-related driver assistance systems (ADAS).

One such assistance system is the Automatically Commanded Steering Function (ACSF), which automatically intervenes in the steering to assist the driver while driving. The UNECE divides ACSF into further subcategories; here, Category B1 ("keeping the vehicle within the selected lane") is considered.
The following scenarios are defined for B1:

  • Lane keeping functional test (UNECE R79 Annex 8, 3.2.1 / Test 1)
  • Maximum lateral acceleration test (UNECE R79 Annex 8, 3.2.2 / Test 2)
  • Overriding force test (UNECE R79 Annex 8, 3.2.3 / Test 3)
  • Transition test; hands-on test (UNECE R79 Annex 8, 3.2.4 / Test 4)

Each test must be performed in 4 speed ranges (10-60, 60-100, 100-130, >130 km/h) and with defined minimum and maximum lateral accelerations. The AF must meet the corresponding requirements over the entire range, from which the test plan is developed. Table 1 [1] shows an example of the final test plan with the scenario parameters and their distribution to suitable test instances.

Test plan and scenario parameter distribution to different test instances
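The structure of such a test plan can be sketched as a cross product of tests, speed ranges, and lateral-acceleration levels. The allocation rule below is a hypothetical placeholder, not the actual distribution shown in Table 1:

```python
import itertools

# Test matrix per UNECE R79 Annex 8: 4 tests x 4 speed ranges x min/max
# lateral acceleration.
tests = ["Test 1", "Test 2", "Test 3", "Test 4"]
speed_ranges = ["10-60", "60-100", "100-130", ">130"]  # km/h
accel_levels = ["min", "max"]

plan = list(itertools.product(tests, speed_ranges, accel_levels))
print(len(plan))  # 4 * 4 * 2 = 32 concrete test cases

def instance(case):
    """Hypothetical allocation rule: send one acceleration level per
    test/speed-range pair to the proving ground and the remaining
    variations to ViL or simulation."""
    _test, _band, accel = case
    return "proving ground" if accel == "min" else "ViL/simulation"

allocation = {case: instance(case) for case in plan}
```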

In order to meet the requirements of UNECE R79, a real test was performed for each scenario and speed range. Further real tests were carried out to validate the model quality of the virtual tests. The Vehicle-in-the-Loop method (ViL, e.g. the AVL Drivingcube™) enables safety-critical tests at high speeds, including the entire signal chain and deceleration times, with the vehicle in a virtual environment. In addition, ViL approaches allow the parameter variations to be applied over a wide range (different speeds, curve radii, etc.). Since simulation is a scalable test method, many parameter combinations were allocated to this test instance to complete the test coverage.

The following pass/fail criteria are used for the safety assessment of the SUT:

  • Test 1 - No crossing of the lane markings; lateral jerk ≤ 5 m/s³
  • Test 2 - Lateral acceleration limits not exceeded; lateral jerk ≤ 5 m/s³
  • Test 3 - Force required to override the system ≤ 50 N
  • Test 4 - Visual and audible warning signals as well as deactivation of the system

After the individual tests (across scenario parameters and test instances) have been performed, the results are evaluated against the specified pass/fail criteria. The final result is documented in a detailed report that can be used for homologation in the future [1].
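The pass/fail evaluation can be sketched as follows; the function names and example measurements are illustrative, while the 50 N override force and the warning/deactivation criteria follow the list above:

```python
# Hypothetical evaluation of two of the B1 pass/fail criteria listed above.
def evaluate_test3(override_force_n):
    """Test 3: the force needed to override the system must not exceed 50 N."""
    return override_force_n <= 50.0

def evaluate_test4(visual_warning, audible_warning, system_deactivated):
    """Test 4: both warnings must appear and the system must disengage."""
    return visual_warning and audible_warning and system_deactivated

# Illustrative results for one run; real values come from the test instances.
results = {
    "Test 3": evaluate_test3(override_force_n=42.0),
    "Test 4": evaluate_test4(True, True, True),
}
print(all(results.values()))  # overall verdict for this run
```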


For an efficient validation and homologation of AF, a combination of virtual and real tests is necessary in order to cover the complexity and variety of situations to a sufficiently high degree. Although there are still no applicable international regulations for AF, there are ongoing initiatives that support this method, e.g. at the United Nations or in alliances such as IAMTS and ASAM.


[1]: C. Gnandt, T. Düser, Homologation and Validation of Automated Driving Functions – It's all about an efficient method and process
[2]: UNECE, Addendum 78: UN Regulation No. 79, 4th edition, 2018
[3]: T. Düser, H. Abdellatif, C. Gutenkunst, C. Gnandt, Approaches for the Homologation of Automated Driving Functions (Ansätze für die Homologation automatisierter Fahrfunktionen), ATZelektronik, Vol. 14, 2019
[4]: Deutsches Zentrum für Luft- und Raumfahrt e.V. (DLR), PEGASUS project, 2019, accessed 18 February 2019
[5]: T. Ponn, C. Gnandt, F. Diermeyer, An Optimization-Based Method to Identify Relevant Scenarios for Type Approval of Automated Vehicles, 26th ESV Conference, 2019
[6]: H. Abdellatif, C. Gnandt, Use of Simulation for the Homologation of Automated Driving Functions (Einsatz der Simulation für die Homologation automatisierter Fahrfunktionen), ATZelektronik, Issue 12, December 2019
[7]: M. Benedikt, D. Watzenig, J. Zehetner, Functional Development of Modern Control Units through Co-Simulation and Model Libraries, ATZ Elektronik Worldwide, Volume 10, May 2015
[8]: ISO/DIS 19365, Passenger cars — Validation of vehicle dynamic simulation — Sine with dwell stability control testing


Benjamin Koller: Coordinator Technical Regulations and Knowledge Management Automated and Connected Driving, TÜV SÜD Auto Service GmbH, Garching
Dr. Tobias Düser: Department Manager Advanced Solution Lab, AVL Deutschland GmbH, Karlsruhe

For more information please contact our expert Benjamin Koller.

