Whose Track Is It Anyway? Improving Robustness to Tracking Errors with Affinity-based Trajectory Prediction

Xinshuo Weng1,3, Boris Ivanovic3, Kris Kitani1, Marco Pavone2,3

1Robotics Institute, Carnegie Mellon University
2Department of Aeronautics and Astronautics, Stanford University
3NVIDIA Research

IEEE/CVF Conf. on Computer Vision and Pattern Recognition (CVPR), 2022


Multi-agent trajectory prediction is critical for planning and decision-making in human-interactive autonomous systems, such as self-driving cars. However, most prediction models are developed separately from their upstream perception (detection and tracking) modules and assume ground-truth past trajectories as inputs. As a result, their performance degrades significantly when real-world noisy tracking results are used as inputs instead. This degradation is typically caused by the propagation of errors from tracking to prediction, such as noisy tracks, fragments, and identity switches. To alleviate this propagation of errors, we propose a new prediction pipeline that only requires detections and their affinity matrices across frames as inputs, entirely removing the need for error-prone data association during tracking. Since affinity matrices contain "soft" information about the similarity and identity of detections across frames, predicting directly from them retains strictly more information than using the tracklets generated by data association in tracking. Thorough experiments on large-scale, real-world autonomous driving datasets show that our affinity-based prediction scheme reduces overall prediction errors by up to 60.2%, in comparison to standard prediction pipelines that use tracklets as inputs, with even more significant error reduction (up to 88.2%) when restricting the evaluation to challenging scenarios with tracking errors.
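To illustrate what an affinity matrix is in this context, the sketch below builds a simple IoU-based affinity between detections in two consecutive frames. This is only a minimal illustration, not the learned affinity used in the paper: entry A[i, j] softly scores how likely detection i at frame t and detection j at frame t+1 belong to the same agent, without committing to a hard data association.

```python
import numpy as np

def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    x1 = max(box_a[0], box_b[0]); y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2]); y2 = min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

def affinity_matrix(dets_t, dets_t1):
    """Soft cross-frame affinity: rows index detections at frame t,
    columns index detections at frame t+1."""
    A = np.zeros((len(dets_t), len(dets_t1)))
    for i, a in enumerate(dets_t):
        for j, b in enumerate(dets_t1):
            A[i, j] = iou(a, b)
    return A
```

A tracklet-based pipeline would threshold and match on such a matrix (e.g. via the Hungarian algorithm), discarding the near-miss scores; an affinity-based predictor instead consumes the full matrix, so ambiguous associations remain visible downstream.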



@inproceedings{weng2022whose,
  author    = {Weng, Xinshuo and Ivanovic, Boris and Kitani, Kris and Pavone, Marco},
  booktitle = {IEEE/CVF Conf. on Computer Vision and Pattern Recognition (CVPR)},
  title     = {{Whose Track Is It Anyway? Improving Robustness to Tracking Errors with Affinity-based Trajectory Prediction}},
  year      = {2022}
}
