Publication
Robust multimodal and multi-object tracking for autonomous driving applications
Conference Article
Conference
International Conference on Advanced Robotics (ICAR)
Edition
2023
Pages
100-106
Doc link
http://dx.doi.org/10.1109/ICAR58858.2023.10406433
Authors
M. Pérez and A. Agudo
Abstract
In this work, we present a method for Multi-Object Tracking (MOT) that uses unsynchronized multimodal detections from a configurable set of sensors such as cameras, radars and lidars. Information from each sensor is processed by a modality-specific detector and then combined in the MOT module, which incorporates a Kalman filter and tracklet-management logic. To enable deployment in real-world driving applications, the MOT module handles localization errors, misclassifications and partial bounding-box detections produced by the object detectors. We show promising results and compare them against competing approaches in two challenging real-world scenarios: a traffic-jam chauffeur and a traffic-monitoring application on highways.
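The paper itself provides no code; as a rough illustration of the fusion step described in the abstract, the sketch below shows a constant-velocity Kalman filter that ingests unsynchronized position measurements from sensors with different noise characteristics. The class, state layout and noise values are assumptions for illustration, not the authors' implementation, and tracklet management (creation, association and deletion) is omitted.

```python
# Illustrative sketch only (not the authors' implementation): a constant-velocity
# Kalman filter that fuses asynchronous position measurements from different
# sensors, each with its own measurement-noise covariance. State: [x, y, vx, vy].
import numpy as np

class KalmanTrack:
    def __init__(self, xy, t0):
        self.x = np.array([xy[0], xy[1], 0.0, 0.0])     # state estimate
        self.P = np.diag([1.0, 1.0, 10.0, 10.0])        # state covariance
        self.t = t0                                     # time of last update
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], dtype=float)  # measure position only
        self.q = 0.5                                    # process-noise intensity (assumed)

    def predict(self, t):
        """Propagate the state to time t with a constant-velocity model."""
        dt = t - self.t
        F = np.array([[1, 0, dt, 0],
                      [0, 1, 0, dt],
                      [0, 0, 1, 0],
                      [0, 0, 0, 1]], dtype=float)
        Q = self.q * np.diag([dt**3 / 3, dt**3 / 3, dt, dt])
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + Q
        self.t = t

    def update(self, z, R, t):
        """Fuse a position measurement z (2,) with covariance R (2x2) taken at time t."""
        self.predict(t)                              # handles unsynchronized arrival times
        y = z - self.H @ self.x                      # innovation
        S = self.H @ self.P @ self.H.T + R           # innovation covariance
        K = self.P @ self.H.T @ np.linalg.inv(S)     # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P

# Hypothetical usage: camera and radar detections of the same object arrive
# at different times and with different noise levels.
track = KalmanTrack(xy=(10.0, 2.0), t0=0.00)
track.update(np.array([10.6, 2.1]), R=np.diag([0.5, 0.5]), t=0.05)  # camera detection
track.update(np.array([11.1, 2.0]), R=np.diag([0.2, 0.9]), t=0.10)  # radar detection
print("estimated state [x, y, vx, vy]:", np.round(track.x, 2))
```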
Categories
Mobile Robots.
Author keywords
Multiple Object Tracking, Sensor Fusion.
Scientific reference
M. Pérez and A. Agudo. Robust multimodal and multi-object tracking for autonomous driving applications, 2023 International Conference on Advanced Robotics (ICAR), Abu Dhabi, UAE, pp. 100-106, IEEE.