Clione

TartanGround

Unverified
Last updated
Unknown
Release date
July 30, 2025
Size
1,440,000 samples | -- GB
License
CC BY 4.0
Tags
robot learning
SLAM
multi-modal
perception
ground robots
LiDAR
simulation

TartanGround is a large-scale, multi-modal dataset for advancing the perception and autonomy of ground robots operating in diverse environments. Collected in a range of photorealistic simulation environments, the dataset includes multiple RGB stereo cameras for 360-degree coverage, along with depth, optical flow, stereo disparity, LiDAR point clouds, ground-truth poses, semantically segmented images, and occupancy maps with semantic labels. Data is collected with an integrated automatic pipeline that generates trajectories mimicking the motion patterns of various ground robot platforms, including wheeled and legged robots. In total, the authors collect 878 trajectories across 63 environments, yielding 1.44 million samples.


Modality
image
Format
PCD
JSON
Source
Author
Manthan Patel
Fan Yang
Yuheng Qiu
Cesar Cadena
Sebastian Scherer
Marco Hutter
Wenshan Wang
Institution
ETH Zurich
Carnegie Mellon University

Citation

@article{patel2025tartanground,
  title={TartanGround: A Large-Scale Dataset for Ground Robot Perception and Navigation},
  author={Patel, Manthan and Yang, Fan and Qiu, Yuheng and Cadena, Cesar and Scherer, Sebastian and Hutter, Marco and Wang, Wenshan},
  journal={arXiv preprint arXiv:2505.10696},
  year={2025}
}

Example usage
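The dataset card lists PCD point clouds and JSON metadata among its formats. As a minimal sketch of working with one sample, the snippet below parses a simple ASCII PCD into a NumPy array and reads a pose from JSON. The field layout, pose keys, and file contents shown here are illustrative assumptions, not the dataset's documented schema; real TartanGround files may be binary PCD and are better handled with a dedicated library such as open3d.

```python
import json
import numpy as np

def load_ascii_pcd(lines):
    """Parse a minimal ASCII PCD into an (N, 3) array of x, y, z points.

    Handles only the simple ASCII variant of the PCD format; binary
    files need a proper point-cloud library.
    """
    it = iter(lines)
    fields, n_points = [], 0
    for line in it:
        tok = line.split()
        if not tok:
            continue
        if tok[0] == "FIELDS":
            fields = tok[1:]
        elif tok[0] == "POINTS":
            n_points = int(tok[1])
        elif tok[0] == "DATA":
            assert tok[1] == "ascii", "this sketch only parses ASCII PCD"
            break
    pts = np.array([[float(v) for v in next(it).split()]
                    for _ in range(n_points)])
    idx = [fields.index(f) for f in ("x", "y", "z")]
    return pts[:, idx]

# Illustrative stand-ins for one sample's files (real layout may differ).
pcd_text = """VERSION .7
FIELDS x y z
SIZE 4 4 4
TYPE F F F
COUNT 1 1 1
WIDTH 2
HEIGHT 1
POINTS 2
DATA ascii
1.0 2.0 3.0
4.0 5.0 6.0""".splitlines()

pose_json = '{"position": [0.0, 0.0, 0.5], "orientation": [0.0, 0.0, 0.0, 1.0]}'

cloud = load_ascii_pcd(pcd_text)   # (2, 3) array of points
pose = json.loads(pose_json)       # hypothetical pose record
print(cloud.shape)                 # (2, 3)
print(pose["position"])            # [0.0, 0.0, 0.5]
```

In practice one would iterate over a trajectory directory, pairing each point cloud with its ground-truth pose and the corresponding camera images.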

Similar datasets



Clione is an open repository for transparent dataset sourcing, supporting responsible research in robotics and machine learning.
Our mission is to make finding and understanding datasets easy and intuitive.
