Robust multimodal and multi-object tracking for autonomous driving applications

Conference Article


International Conference on Advanced Robotics (ICAR)





In this work, we present a Multi-Object Tracking (MOT) method that fuses unsynchronized multimodal detections from a configurable set of sensors, such as cameras, radars and lidars. Raw data from each sensor is processed by a modality-specific detector, and the resulting detections are combined in the MOT module, which incorporates a Kalman filter and tracklet-management logic. To make the system deployable in real-world driving applications, the MOT module explicitly handles localization errors, misclassifications and partial bounding-box detections from the object detectors. We report promising results and compare them against competing approaches in two challenging real-world scenarios: a traffic-jam chauffeur and a traffic-monitoring application on highways.
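The abstract describes a pipeline in which per-sensor detections feed a Kalman filter plus tracklet-management logic. A minimal sketch of that idea is shown below; it is not the paper's implementation, and all names (`Tracklet`, `Tracker`, the gating threshold, the constant-velocity model, nearest-neighbour association) are illustrative assumptions.

```python
import numpy as np

class Tracklet:
    """Constant-velocity Kalman tracklet over 2D positions (illustrative sketch)."""
    def __init__(self, z, tid, dt=0.1):
        # State: [x, y, vx, vy]; initialized from the first detection z = [x, y].
        self.x = np.array([z[0], z[1], 0.0, 0.0])
        self.P = np.eye(4) * 10.0                              # state covariance
        self.F = np.eye(4); self.F[0, 2] = self.F[1, 3] = dt   # constant-velocity motion
        self.H = np.eye(2, 4)                                  # measure position only
        self.Q = np.eye(4) * 0.01                              # process noise (assumed)
        self.R = np.eye(2) * 0.5                               # measurement noise (assumed)
        self.id, self.hits, self.misses = tid, 1, 0

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]

    def update(self, z):
        y = z - self.H @ self.x                           # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)          # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        self.hits += 1; self.misses = 0

class Tracker:
    """Nearest-neighbour association plus birth/death tracklet management."""
    def __init__(self, gate=2.0, max_misses=3):
        self.tracklets, self.next_id = [], 0
        self.gate, self.max_misses = gate, max_misses

    def step(self, detections):
        """Run one cycle on a list of 2D detection arrays; returns {id: position}."""
        preds = [t.predict() for t in self.tracklets]
        unmatched = list(range(len(detections)))
        for t, p in zip(self.tracklets, preds):
            if unmatched:
                j = min(unmatched, key=lambda k: np.linalg.norm(detections[k] - p))
                if np.linalg.norm(detections[j] - p) < self.gate:
                    t.update(detections[j]); unmatched.remove(j)
                    continue
            t.misses += 1                                  # no detection within the gate
        for j in unmatched:                                # birth: new tracklet per leftover detection
            self.tracklets.append(Tracklet(detections[j], self.next_id))
            self.next_id += 1
        # Death: drop tracklets missed for too many consecutive cycles.
        self.tracklets = [t for t in self.tracklets if t.misses <= self.max_misses]
        return {t.id: t.x[:2] for t in self.tracklets}
```

Because each tracklet carries its own filter state and miss counter, detections arriving at different rates from different modalities can be folded in whenever they appear, which is the property the abstract highlights for unsynchronized sensors.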



Author keywords

Multiple Object Tracking, Sensor Fusion.

Scientific reference

M. Pérez and A. Agudo. Robust multimodal and multi-object tracking for autonomous driving applications. In 2023 International Conference on Advanced Robotics (ICAR), Abu Dhabi, UAE, IEEE, to appear.