Publication

Prediction stability as a criterion in active learning

Conference Article

Conference

International Conference on Artificial Neural Networks (ICANN)

Edition

2020

Pages

157-167

Doc link

https://doi.org/10.1007/978-3-030-61616-8_13


Authors

J. Liu, X. Li, J. Zhou and J. Shen

Abstract

Recent breakthroughs made by deep learning rely heavily on large numbers of annotated samples. Active learning is one way to overcome this limitation. In contrast to previous active learning algorithms, which only use information available after training, we propose a new class of methods, called sequential-based methods, that exploit information gathered during training. We introduce a specific active learning criterion, prediction stability, to demonstrate the feasibility of sequential-based methods. We design a toy model to explain the principle of the proposed method and point out a possible defect of earlier uncertainty-based methods. Experiments on CIFAR-10 and CIFAR-100 show that prediction stability is effective and works well on datasets with fewer labels. Prediction stability matches the accuracy of traditional acquisition functions such as entropy on CIFAR-10 and notably outperforms them on CIFAR-100.

Categories

Pattern recognition

Author keywords

deep learning, active learning, classification, sequential-based, prediction stability

Scientific reference

J. Liu, X. Li, J. Zhou and J. Shen. Prediction stability as a criterion in active learning. International Conference on Artificial Neural Networks (ICANN), Bratislava, Slovakia, 2020. In Artificial Neural Networks and Machine Learning – ICANN 2020, Vol. 12397 of Lecture Notes in Computer Science, pp. 157-167, Springer, Cham, Switzerland.