URScenes: A Multi-scenario Dataset for Unstructured Road Environments
Abstract
As autonomous driving technology transitions from small-scale validation to large-scale deployment, extending it to unstructured road environments has become both critical and inevitable. Autonomous vehicles increasingly rely on high-quality, diverse datasets to train and evaluate their perception systems. However, existing public datasets predominantly cover clear-weather, urban-road scenarios, leaving a significant gap in the coverage of unstructured road environments. To bridge this gap, we construct URScenes, the first multi-scenario, open-source perception dataset for unstructured road environments. The dataset consists of 472 scenes, each lasting 30 seconds, and provides over 28K annotated samples and 119K sweeps. URScenes is the first dataset to cover eight typical scenarios: rainy, snowy, foggy, dusty, glare, night, cloudy, and sunny conditions. In addition, URScenes supports multi-task perception in unstructured road environments, including 3D object detection, multi-object tracking, and 3D occupancy prediction. URScenes also provides a unified annotation system and format-conversion tools that enable easy conversion to popular formats such as NuScenes, KITTI, and Waymo. Finally, this study presents comparative experimental results assessing the performance of state-of-the-art algorithms on the URScenes dataset. The data, development toolkit, and additional information are available online.