Publication
Bootstrapping boosted random Ferns for discriminative and efficient object classification
Journal Article (2012)
Journal
Pattern Recognition
Pages
3141-3153
Volume
45
Number
9
Doc link
http://dx.doi.org/10.1016/j.patcog.2012.03.025
Authors
M. Villamizar, J. Andrade-Cetto, A. Sanfeliu, F. Moreno-Noguer
Projects associated
MIPRCV: CONSOLIDER-INGENIO 2010 Multimodal interaction in pattern recognition and computer vision
PAU: Perception and Action under Uncertainty
RobTaskCoop: Robot-Human Cooperation in Urban Areas
PAU+: Perception and Action in Robotics Problems with Large State Spaces
ARCAS: Aerial Robotics Cooperative Assembly System
Abstract
In this paper we show that the performance of binary classifiers based on Boosted Random Ferns can be significantly improved by appropriately bootstrapping the training step. This results in a classifier that is both highly discriminative and computationally efficient, and is particularly suitable when only small sets of training images are available. During the learning process, a small set of labeled images is used to train the boosting binary classifier. The classifier is then evaluated over the training set, and warped versions of the correctly classified and misclassified patches are progressively added to the positive and negative sample sets for a new retraining step. In this paper we thoroughly study the conditions under which this bootstrapping scheme improves the detection rates. In particular, we assess the quality of detection as a function of both the number of bootstrapping iterations and the size of the training set. We compare our algorithm against state-of-the-art approaches on several databases, including faces, cars, motorbikes and horses, and show remarkable improvements in detection rates with just a few bootstrapping steps.
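As a rough illustration of the bootstrapping loop described in the abstract, the sketch below mimics the evaluate-augment-retrain cycle with toy stand-ins: train_classifier, warp and the scalar "patches" are hypothetical placeholders, not the Boosted Random Ferns or the image warps used in the paper.

```python
import random

# Minimal sketch of the bootstrapping scheme, assuming toy stand-ins:
# samples are scalars, the "classifier" is a simple threshold, and warp()
# adds a small perturbation. A real implementation would train Boosted
# Random Ferns on image patches and warp the patches geometrically.

def train_classifier(positives, negatives):
    """Placeholder training step: threshold halfway between class means."""
    threshold = (sum(positives) / len(positives) +
                 sum(negatives) / len(negatives)) / 2.0
    return lambda x: x - threshold  # score > 0 means "object"

def warp(sample):
    """Placeholder for generating a warped copy of a training patch."""
    return sample + random.uniform(-0.05, 0.05)

def bootstrap_train(positives, negatives, iterations=3):
    """Retrain the classifier, growing the sample sets after each pass."""
    classifier = train_classifier(positives, negatives)
    for _ in range(iterations):
        for x in list(positives):
            if classifier(x) > 0:            # correctly classified positive
                positives.append(warp(x))    # enrich the positive set
        for x in list(negatives):
            if classifier(x) > 0:            # misclassified negative (false positive)
                negatives.append(warp(x))    # enrich the negative set
        classifier = train_classifier(positives, negatives)
    return classifier

# Toy usage: separable 1-D "object" and "background" samples.
pos = [random.uniform(0.6, 1.0) for _ in range(20)]
neg = [random.uniform(0.0, 0.5) for _ in range(20)]
clf = bootstrap_train(pos, neg)
print(clf(0.8) > 0, clf(0.2) > 0)  # expected: True False
```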
Categories
pattern recognition
Author keywords
object detection, boosting, bootstrapping, random ferns
Scientific reference
M. Villamizar, J. Andrade-Cetto, A. Sanfeliu and F. Moreno-Noguer. Bootstrapping boosted random Ferns for discriminative and efficient object classification. Pattern Recognition, 45(9): 3141-3153, 2012.