TARS: Traffic-Aware Radar Scene Flow Estimation
Jialong Wu1,2 • Marco Braun2 • Dominic Spata2 • Matthias Rottmann1
1 Department of Mathematics, University of Wuppertal, Wuppertal, Germany
2 Aptiv Services Deutschland GmbH
IEEE/CVF International Conference on Computer Vision (ICCV), 2025
Abstract
Scene flow provides crucial motion information for autonomous driving. Recent LiDAR scene flow models exploit the rigid-motion assumption at the instance level, treating objects as rigid bodies. However, such instance-level methods are ill-suited to sparse radar point clouds. In this work, we present a novel Traffic-Aware Radar Scene Flow estimation method, named TARS, which exploits motion rigidity at the traffic level. To address the challenges of radar scene flow, we perform object detection and scene flow estimation jointly and use the former to boost the latter: we incorporate the feature map of the object detector, trained with detection losses, to make radar scene flow aware of the environment and of road users. From this feature map, we construct a Traffic Vector Field (TVF) in feature space, enabling holistic traffic-level scene understanding in our scene flow branch. When estimating the scene flow, we consider both point-level motion cues from point neighbors and traffic-level consistency of rigid motion within the space. TARS outperforms the state of the art on a proprietary dataset and on the View-of-Delft dataset, improving over the previous benchmarks by 23% and 15%, respectively.
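The core idea of making point-level scene flow "traffic-aware" is to look up, for each radar point, the detector's bird's-eye-view (BEV) feature at that point's location and fuse it with the point's own features. The sketch below illustrates this lookup-and-fuse step with a simple nearest-cell sampler; all names, shapes, and the grid parameters are hypothetical assumptions for illustration, not the paper's actual implementation.

```python
import numpy as np

def sample_traffic_features(points_xy, feat_map, x_min, y_min, cell_size):
    """Nearest-cell lookup of per-point features from a BEV feature map.

    points_xy : (N, 2) point coordinates in BEV metres (hypothetical layout)
    feat_map  : (C, H, W) detector feature map over the BEV grid
    Returns   : (N, C) per-point traffic features
    """
    C, H, W = feat_map.shape
    cols = np.clip(((points_xy[:, 0] - x_min) / cell_size).astype(int), 0, W - 1)
    rows = np.clip(((points_xy[:, 1] - y_min) / cell_size).astype(int), 0, H - 1)
    return feat_map[:, rows, cols].T  # advanced indexing -> (C, N), transposed

# Toy data: 5 radar points on a 10 m x 10 m BEV grid with 0.5 m cells.
rng = np.random.default_rng(0)
points_xy = rng.uniform(0.0, 10.0, size=(5, 2))
point_feats = rng.normal(size=(5, 16))       # point-level features (assumed dim 16)
detector_map = rng.normal(size=(8, 20, 20))  # (C, H, W) detector BEV features

traffic_feats = sample_traffic_features(points_xy, detector_map, 0.0, 0.0, 0.5)
fused = np.concatenate([point_feats, traffic_feats], axis=1)  # (5, 16 + 8)
```

In the paper the sampled field is processed further (the Traffic Vector Field) and combined with neighborhood motion cues; this sketch only shows the basic per-point feature retrieval that such a design rests on.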
Acknowledgments
J.W. and M.R. acknowledge support by the German Federal Ministry of Education and Research within the junior research group project “UnrEAL” (grant no. 01IS22069).
Citation
@article{wu2025tars,
  title={TARS: Traffic-Aware Radar Scene Flow Estimation},
  author={Wu, Jialong and Braun, Marco and Spata, Dominic and Rottmann, Matthias},
  journal={arXiv preprint arXiv:2503.10210},
  year={2025}
}
Contact
Have questions or want to collaborate? Reach out:
- Email: jialong.wu@uni-wuppertal.de