The UZH-FPV Drone Racing dataset is the most aggressive visual-inertial odometry dataset to date. Aggressive trajectories are difficult for state estimation because of their large accelerations, fast rotations, and high apparent motion in the vision sensors. Yet many compelling applications, such as autonomous drone racing, require high-speed state estimation, and existing datasets do not cover this regime. These sequences were recorded with a first-person-view (FPV) drone racing quadrotor fitted with sensors and flown aggressively by an expert pilot. The trajectories include fast laps around a racetrack with drone racing gates, as well as free-form trajectories around obstacles, both indoors and outdoors. We provide the camera images and IMU data from a Qualcomm Snapdragon Flight board, ground truth from a Leica Nova MS60 laser tracker, event data from an mDAVIS 346 event camera, and high-resolution RGB images from the pilot’s FPV camera. With this dataset, our goal is to help advance the state of the art in high-speed state estimation.
All datasets are provided in two formats: text files and binary files (rosbag). While their content is identical, each format is better suited to particular applications. The binary rosbag files are intended for users familiar with the Robot Operating System (ROS) and for applications intended to run on a real system.
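As a rough illustration of how the plain-text files can be consumed outside of ROS, the sketch below loads a ground-truth trajectory with NumPy. The file name and column layout (timestamp, position, orientation quaternion) are assumptions made for illustration only; consult the format notes shipped with the dataset for the exact fields of each text file.

import numpy as np

# Minimal sketch, assuming a hypothetical column layout:
#   timestamp tx ty tz qx qy qz qw
# Check the dataset's format notes for the actual fields of each file.
gt = np.loadtxt('groundtruth.txt')   # lines starting with '#' are skipped by default
timestamps = gt[:, 0]                # assumed: time in seconds
positions = gt[:, 1:4]               # assumed: x, y, z position
orientations = gt[:, 4:8]            # assumed: orientation quaternion (qx, qy, qz, qw)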
The rosbag files contain images and IMU measurements using the standard sensor_msgs/Image and sensor_msgs/Imu message types, respectively. The events are provided as dvs_msgs/EventArray message types, and the ground truth is provided as geometry_msgs/PoseStamped messages. The Events/IMU/GT bag files also contain the image frames from the mDAVIS as sensor_msgs/Image messages. Note that some ground truth files are withheld for benchmarks and competitions.
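For users working with the rosbag files, the following sketch shows how the IMU and ground-truth streams could be read with the standard rosbag Python API. The bag file name and topic names are placeholders rather than the dataset's actual names; run rosbag info on a sequence to list the topics it contains.

import rosbag

# Sketch of reading IMU and ground-truth messages from one sequence.
# File name and topic names are placeholders; use `rosbag info` to find
# the actual topic names in the bag you downloaded.
bag = rosbag.Bag('sequence_with_gt.bag')
for topic, msg, t in bag.read_messages(topics=['/imu', '/groundtruth/pose']):
    if topic == '/imu':                          # sensor_msgs/Imu
        stamp = msg.header.stamp.to_sec()
        gyro = (msg.angular_velocity.x,
                msg.angular_velocity.y,
                msg.angular_velocity.z)
        accel = (msg.linear_acceleration.x,
                 msg.linear_acceleration.y,
                 msg.linear_acceleration.z)
    else:                                        # geometry_msgs/PoseStamped
        position = (msg.pose.position.x,
                    msg.pose.position.y,
                    msg.pose.position.z)
bag.close()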
We provide the calibration parameters for the camera intrinsics and camera-IMU extrinsics in YAML format, as well as the raw calibration sequences used to produce them with the Kalibr calibration toolbox.
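The sketch below shows one way to load a Kalibr-style camchain YAML in Python and assemble the pinhole intrinsic matrix and the camera-IMU extrinsic. The file name is a placeholder, and the keys shown (intrinsics, distortion_coeffs, T_cam_imu) follow Kalibr's usual camchain output; verify them against the calibration files provided with the sequences.

import numpy as np
import yaml

# Placeholder file name; use the camchain YAML provided with the sequence.
with open('camchain-imucam.yaml') as f:
    calib = yaml.safe_load(f)

cam0 = calib['cam0']
fu, fv, cu, cv = cam0['intrinsics']          # pinhole focal lengths and principal point
K = np.array([[fu, 0.0, cu],
              [0.0, fv, cv],
              [0.0, 0.0, 1.0]])
dist = np.array(cam0['distortion_coeffs'])   # model name is in cam0['distortion_model']
T_cam_imu = np.array(cam0['T_cam_imu'])      # 4x4 transform taking IMU-frame points to the camera frame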
This work was supported by the National Centre of Competence in Research Robotics (NCCR) through the Swiss National Science Foundation, the SNSF-ERC Starting Grant, and the DARPA FLA Program.
This work would not have been possible without the assistance of Stefan Gächter, Zoltan Török, and Thomas Mörwald of Leica Geosystems and their support in gathering our data. Additional thanks go to Innovation Park Zürich and the Fässler family for providing experimental space; to iniVation AG and Prof. Tobi Delbruck for their support and guidance with the mDAVIS sensors; and to Hanspeter Kunz from the Department of Informatics at the University of Zurich for his support in setting up this website.
@inproceedings{Delmerico19icra,
author = {Jeffrey Delmerico and Titus Cieslewski and Henri Rebecq and Matthias Faessler and Davide Scaramuzza},
booktitle = {{IEEE} Int. Conf. Robot. Autom. ({ICRA})},
title = {Are We Ready for Autonomous Drone Racing? The {UZH-FPV} Drone Racing Dataset},
year = {2019}
}