Not only do we offer probably the largest and most accurate vehicle trajectory dataset currently available, but we are also experts in using the dataset for a variety of automated driving applications. This enables us to offer you an extensive package of services around our datasets!
Individual Rights of Use
Acquire the rights to exclusively use a dataset and develop your own applications.
We offer analyses, evaluations and also the parameterization of models on the basis of any of our datasets.
At your request we can offer you individual solutions.
We have the data, know its application and have defined the interfaces.
Application areas of drone data
Data-driven methods are now an integral part of the field of automated driving. Many problems require extensive datasets of precise measurements of the movements and natural behavior of road users.
Validation of Sensor Systems
Automated vehicles use different sensors such as cameras, lidar or radar in combination. To verify and validate these sensors, reference measurement data is required that represents the actual environmental data as accurately as possible. However, the measurement methodology used to generate the reference data must be superior to the vehicle sensors.
Development and Parameterization of Models
Automated driving functions and the underlying tooling such as simulation environments are becoming increasingly complex. Handcrafted models are increasingly complemented or even replaced by data-driven models. To derive these models, trajectory data that capture the natural behavior of human road users are required.
Safety Validation of Automated Driving Functions
The development of highly automated driving functions includes their safety validation. As shown in the PEGASUS project, scenario-based approaches are promising. However, all scenarios that an automated vehicle can encounter in its operational area must be systematically modelled and their frequency quantified, which can only be done in a data-driven way.
Extraction of Specific Scenarios
Example data of scenarios are often required for the development of driving functions. Examples are trajectories of cut-in or turning maneuvers. The aerial perspective makes it possible to clearly identify and locate road users and road infrastructure for a subsequent extraction of scenario trajectories.
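To illustrate what such an extraction can look like, here is a minimal sketch, assuming per-frame trajectory rows annotated with a lane id; the row layout and function name are hypothetical and not a fixed interface of our datasets. Lane-change events found this way are the typical starting point for identifying cut-in candidates.

```python
# Minimal sketch: finding lane-change events (cut-in candidates) in
# hypothetical per-frame trajectory rows of the form (frame, vehicle_id, lane_id).
def lane_change_events(rows):
    """Return (vehicle_id, frame, from_lane, to_lane) for each lane change."""
    last_lane = {}
    events = []
    for frame, vid, lane in sorted(rows):  # sort by frame, then vehicle id
        prev = last_lane.get(vid)
        if prev is not None and lane != prev:
            events.append((vid, frame, prev, lane))
        last_lane[vid] = lane
    return events

rows = [
    (0, 1, 2), (1, 1, 2), (2, 1, 3),  # vehicle 1 moves from lane 2 to lane 3
    (0, 2, 3), (1, 2, 3), (2, 2, 3),  # vehicle 2 stays in lane 3
]
print(lane_change_events(rows))  # -> [(1, 2, 2, 3)]
```

A real extraction would additionally check the relative position and speed of surrounding vehicles to decide whether a lane change actually constitutes a cut-in.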
Scenario-based Analysis and Scenario Statistics
For a comprehensive safety validation and impact assessment, every possible traffic scenario in the operational domain of an automated driving function must be tested. Drone data provide the optimal basis for this, since recorded trajectories can automatically be assigned to scenarios, and scenario-specific parameters such as distances, speeds or derived metrics can be extracted. These form a scenario database and are the basis for intelligent testing of automated driving functions.
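As a small example of such a scenario-specific parameter, the following sketch derives time headways from follower speed and gap samples of a car-following scenario; the sample format is assumed for illustration only.

```python
# Sketch: deriving a scenario-specific metric (time headway in seconds)
# from hypothetical (follower_speed_mps, gap_to_leader_m) samples.
def time_headways(samples):
    return [gap / speed for speed, gap in samples if speed > 0]

samples = [(25.0, 50.0), (20.0, 30.0), (30.0, 45.0)]
thw = time_headways(samples)
print(min(thw), max(thw))  # 1.5 2.0
```

Aggregating such metrics over many recordings yields the distributions that populate a scenario database.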
Training data for prediction models
Every automated vehicle must not only precisely register its surroundings, but also predict the behavior of other road users. This is the only way to ensure a safe and collision-free trajectory. Currently, data-driven models based on neural networks provide the best results for this task. For the training of these models, trajectories of road users extracted from drone recordings are excellently suited, since the positions, speeds and accelerations of each road user have a consistently high quality.
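A common way to prepare trajectories for such prediction models is to cut them into (history, future) pairs. The following sketch shows this windowing step; the window lengths and data layout are illustrative assumptions.

```python
# Sketch: splitting one trajectory into (history, future) training pairs,
# as commonly done for trajectory prediction models.
def sliding_windows(track, hist, fut):
    """Return (history, future) pairs from a list of trajectory points."""
    pairs = []
    for i in range(hist, len(track) - fut + 1):
        pairs.append((track[i - hist:i], track[i:i + fut]))
    return pairs

track = [(t * 1.0, t * 0.5) for t in range(6)]  # illustrative (x, y) points
pairs = sliding_windows(track, hist=3, fut=2)
print(len(pairs))  # 2
```

Each history window becomes a model input and the corresponding future window the prediction target.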
Generation of reference data for sensors
Complex multi-sensor setups of automated vehicles need to be verified by reference measurements. Using a high-resolution camera and deep learning algorithms, these ground truth data can be obtained from drone recordings in local or geo-referenced coordinate systems. During recordings, a drone can hover statically over a measuring location or intelligently follow a target vehicle.
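Conceptually, producing such reference data involves mapping pixel detections into a metric coordinate system. The sketch below uses a simple similarity transform with made-up calibration values; a real pipeline would use a full camera calibration and geo-registration.

```python
# Sketch: mapping drone image pixel coordinates to local metric coordinates
# with a similarity transform. scale, theta, tx, ty are assumed calibration
# values, not real parameters of any specific recording setup.
import math

def pixel_to_local(u, v, scale=0.05, theta=0.0, tx=0.0, ty=0.0):
    """scale in m/pixel; theta in radians; (tx, ty) offset in metres."""
    x = scale * (math.cos(theta) * u - math.sin(theta) * v) + tx
    y = scale * (math.sin(theta) * u + math.cos(theta) * v) + ty
    return x, y

print(pixel_to_local(100, 200))
```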
Representation of relevant scenarios in the simulation
Drone recordings capture traffic participants without necessarily focusing on a particular vehicle. Using the open standards OpenSCENARIO and OpenDRIVE, challenging scenarios can be re-simulated in detail. Driving scenarios can either be re-simulated as recorded or abstracted beforehand, allowing subsequent variation of the scenario.
Representation of relevant scenarios in the highly dynamic driving simulator
Re-simulations can be linked with a highly dynamic driving simulator for the investigation of vehicle passenger behavior and acceptance based on real traffic data. More information about the highly dynamic driving simulator of our partner ika can be found at.
Parameterization of Traffic Simulations
While simulations of comparatively short, concrete scenarios, e.g. of driving maneuvers, are of high relevance for safety validation, the simulation of one or more automated vehicles in continuous traffic is important for the analysis of macroscopic effects. In addition to pure traffic densities, time-dependent, lane-specific statistics on vehicle composition, accelerations, distances and maneuvers such as lane changes can also be obtained from drone recordings.
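A lane-specific statistic of the kind mentioned above can be derived very directly from trajectory samples, as this minimal sketch shows; the sample format is an illustrative assumption.

```python
# Sketch: computing a lane-specific statistic (mean speed per lane)
# from hypothetical (lane_id, speed_mps) trajectory samples.
from collections import defaultdict

def mean_speed_per_lane(rows):
    sums = defaultdict(float)
    counts = defaultdict(int)
    for lane, v in rows:
        sums[lane] += v
        counts[lane] += 1
    return {lane: sums[lane] / counts[lane] for lane in sums}

rows = [(1, 30.0), (1, 32.0), (2, 25.0)]
print(mean_speed_per_lane(rows))  # {1: 31.0, 2: 25.0}
```

Such aggregates, computed per time interval, are what a traffic simulation is then calibrated against.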
Parameterization of Driver and VRU Models
The modeling and parameterization of drivers, pedestrians and other road users is a common task in the context of automated driving. The comparison of human driving behavior and performance with that of an automated vehicle is relevant for safety validation and impact assessment. With the help of drone data, detailed speed and acceleration information is available for every road user to derive scenario- and traffic infrastructure-dependent models.
Sensor Training Data Generation
In addition to validating sensor systems, reference measurements can also be used to generate training data, e.g. for object detection systems based on lidar sensors. For this purpose, lidar point clouds can be automatically annotated using the accurate detections of road users in drone recordings.
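The core of such automatic annotation is assigning lidar points to drone-derived object boxes. The sketch below does this for axis-aligned 2D boxes in a shared coordinate frame; real annotation would use oriented 3D boxes and a calibrated transform between the sensors.

```python
# Sketch: labeling lidar points with the id of the drone-derived
# axis-aligned box (id, xmin, ymin, xmax, ymax) that contains them.
# Points and boxes are assumed to share one ground-plane coordinate frame.
def label_points(points, boxes):
    labels = []
    for x, y in points:
        hit = None
        for bid, xmin, ymin, xmax, ymax in boxes:
            if xmin <= x <= xmax and ymin <= y <= ymax:
                hit = bid
                break
        labels.append(hit)
    return labels

points = [(1.0, 1.0), (5.0, 5.0)]
boxes = [(7, 0.0, 0.0, 2.0, 2.0)]
print(label_points(points, boxes))  # [7, None]
```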
Example Use: PEGASUS Research Project
PEGASUS delivers the standards for the automation of the future. With the PEGASUS joint project, funded by the Federal Ministry for Economic Affairs and Energy (BMWi), key gaps in the field of testing of highly automated driving functions will be closed by the middle of 2019.