

Poster

TAD-E2E: A Large-scale End-to-end Autonomous Driving Dataset

Chang Liu · Mingxu Zhu · Zheyuan Zhang · Linna Song · Xiao Zhao · Qingliang Luo · Qi Wang · Chufan Guo · Kuifeng Su


Abstract:

End-to-end autonomous driving has recently become a focal point of research and application in autonomous driving. State-of-the-art (SOTA) methods are often trained and evaluated on the NuScenes dataset. However, NuScenes, introduced in 2019 for 3D perception tasks, has several limitations—insufficient scale, simple scenes, and homogeneous driving behaviors—that restrict the upper bound of end-to-end autonomous driving algorithms. In light of these issues, we propose TAD-E2E, a novel large-scale real-world dataset designed specifically for end-to-end autonomous driving tasks. TAD-E2E is 25x larger than NuScenes, has 1.7x its scene complexity, and features a highly diverse range of driving behaviors. We reproduced SOTA methods on TAD-E2E and observed that, as expected, they no longer performed well. In response to the challenging scenarios in TAD-E2E, we further devised a multimodal sparse end-to-end method that significantly outperforms SOTA methods. Ablation studies demonstrate the effectiveness of our method, and we analyze the contribution of each module. The dataset and code will be made open source upon acceptance of the paper.
