Publication

Integration of dependent Bayesian filters for robust tracking

Conference Article

Conference

IEEE International Conference on Robotics and Automation (ICRA)

Edition

2006

Pages

4081-4087

Doc link

http://dx.doi.org/10.1109/ROBOT.2006.1642329

Abstract

Robotics applications based on computer vision algorithms are highly constrained to indoor environments, where conditions can be controlled. Developing robust visual algorithms is necessary to improve the capabilities of many autonomous systems in outdoor and dynamic environments. In particular, this paper proposes a tracking algorithm that is robust to several artifacts found in real-world applications, such as lighting changes, cluttered backgrounds and unexpected target movements. To deal with these difficulties, the proposed tracking methodology integrates several Bayesian filters. Each filter estimates the state of a particular object feature that is conditionally dependent on another feature estimated by a distinct filter. This dependence provides improved representations of the target, allowing it to be segmented out from the background of the image. We describe the updating procedure of the Bayesian filters by a ‘hypotheses generation and correction’ scheme. The main difference with respect to previous approaches is that the dependence between filters is considered during the feature observation, i.e., in the ‘hypotheses correction’ stage, instead of when generating the hypotheses. This proves to be much more effective in terms of accuracy and reliability.
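
The idea of placing the inter-filter dependence in the correction stage can be illustrated, very roughly, with the toy sketch below. This is not the paper's implementation: the two particle filters, the 1-D ‘image’, the colour feature and all noise parameters are hypothetical, and serve only to show one filter's observation model being conditioned on another filter's estimate.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500  # particles per filter

def predict(particles, noise):
    # Hypotheses generation: propagate particles with a simple random-walk model.
    return particles + rng.normal(0.0, noise, size=particles.shape)

def correct(particles, likelihood):
    # Hypotheses correction: reweight particles by the observation likelihood
    # and resample.
    w = likelihood(particles)
    w /= w.sum()
    return particles[rng.choice(n, size=n, p=w)]

# Toy 1-D 'image': appearance (e.g. a colour value) as a function of position.
def image_appearance(x):
    return 0.6 + 0.05 * np.sin(x)

# Filter A estimates an appearance feature from its own measurement.
obs_colour = 0.62
colour = predict(rng.normal(0.5, 0.1, n), 0.02)
colour = correct(colour, lambda c: np.exp(-0.5 * ((c - obs_colour) / 0.05) ** 2))
colour_est = colour.mean()

# Filter B estimates position; its observation model compares the image
# appearance at each hypothesised position with A's estimate, so the
# dependence between the two filters enters in the correction stage.
position = predict(rng.normal(0.0, 1.0, n), 0.1)
position = correct(
    position,
    lambda p: np.exp(-0.5 * ((image_appearance(p) - colour_est) / 0.05) ** 2),
)
print(f"colour estimate: {colour_est:.3f}, position estimate: {position.mean():.3f}")
```

In this sketch both filters generate hypotheses independently, and only the likelihood used to correct the position filter is conditioned on the colour filter's estimate, mirroring the abstract's point that the dependence is injected during observation rather than during hypothesis generation.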

Categories

computer vision, object detection

Author keywords

Bayesian methods, object detection

Scientific reference

F. Moreno-Noguer, A. Sanfeliu and D. Samaras. Integration of dependent Bayesian filters for robust tracking. 2006 IEEE International Conference on Robotics and Automation, Orlando, FL, USA, pp. 4081-4087, IEEE.