Trajectory fusion for multiple camera tracking

Conference Article


International Conference on Computer Recognition Systems (CORES)







In this paper we present a robust and efficient method to overcome the negative effects of occlusion when tracking multiple agents. The proposed approach matches multiple trajectories from multiple views using spatial and temporal information. These trajectories are represented as consecutive points on a common ground plane in the world coordinate system that belong to the same tracked agent. We introduce an integral distance between compared trajectories, which avoids mismatches caused by possible measurement outliers in a single frame. The proposed method can also be regarded as an interpolation algorithm that reconnects a trajectory broken during the time of occlusion. This technique addresses one of the most difficult problems of occlusion handling: matching the two disconnected parts of the same trajectory.
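The core idea of an integral distance, accumulating point-wise distances over the whole overlap of two trajectories so that a single-frame outlier cannot dominate the comparison, can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the function names, the greedy matching step, and the threshold value are all hypothetical.

```python
import math

def integral_distance(traj_a, traj_b):
    """Integral (time-accumulated) distance between two ground-plane
    trajectories, each given as a list of (x, y) points sampled at
    common frame times. Averaging over all overlapping frames damps
    the effect of a single-frame measurement outlier.
    (Illustrative; the paper's exact metric may differ.)"""
    n = min(len(traj_a), len(traj_b))
    if n == 0:
        raise ValueError("trajectories do not overlap in time")
    total = sum(math.dist(traj_a[i], traj_b[i]) for i in range(n))
    return total / n  # normalise by the length of the overlap

def fuse_by_matching(trajs_view1, trajs_view2, threshold=0.5):
    """Greedily pair each trajectory from view 1 with the closest
    unmatched trajectory from view 2, if the integral distance is
    below a threshold (hypothetical matching scheme)."""
    matches, used = [], set()
    for i, ta in enumerate(trajs_view1):
        best_j, best_d = None, threshold
        for j, tb in enumerate(trajs_view2):
            if j in used:
                continue
            d = integral_distance(ta, tb)
            if d < best_d:
                best_j, best_d = j, d
        if best_j is not None:
            used.add(best_j)
            matches.append((i, best_j))
    return matches
```

Two trajectories of the same agent seen from different cameras should stay uniformly close on the joint ground plane, so their integral distance is small even if one frame is noisy; trajectories of different agents diverge over time and accumulate a large distance.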


Keywords: computer vision

Scientific reference

A. Amato, M. Al Haj, M. Mozerov and J. Gonzàlez. Trajectory fusion for multiple camera tracking. In 5th International Conference on Computer Recognition Systems (CORES), Wroclaw, Poland; in Computer Recognition Systems 2, Vol. 45 of Advances in Soft Computing, pp. 19-26, Springer, 2007.