2016 23rd International Conference on Pattern Recognition (ICPR)
DOI: 10.1109/icpr.2016.7900034

Active learning using uncertainty information

Abstract: Many active learning methods belong to the retraining-based approaches, which select one unlabeled instance, add it to the training set with its possible labels, retrain the classification model, and evaluate the criterion on which the selection is based. However, since the true label of the selected instance is unknown, these methods resort to calculating the average-case or worst-case performance with respect to the unknown label. In this paper, we propose a different method to solve this problem. In p…
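To make the retraining-based selection loop concrete, the following is a minimal sketch of the average-case variant the abstract describes, written against a scikit-learn-style binary classifier. The logistic-regression model, the confidence-based evaluation score, and the function names are assumptions for illustration, not the paper's exact criterion.

# Illustrative sketch of one average-case, retraining-based selection step.
# The model and the evaluation score are assumptions, not the paper's method.
import numpy as np
from sklearn.linear_model import LogisticRegression

def average_case_score(X_lab, y_lab, x_cand, X_pool):
    """Retrain once per possible label of the candidate and average the
    resulting evaluation score over the unknown label."""
    scores = []
    for y_guess in (0, 1):                              # binary labels assumed
        X_aug = np.vstack([X_lab, x_cand[None, :]])
        y_aug = np.append(y_lab, y_guess)
        clf = LogisticRegression().fit(X_aug, y_aug)    # retrain the model
        # evaluation score: mean predictive confidence on the unlabeled pool
        scores.append(clf.predict_proba(X_pool).max(axis=1).mean())
    return float(np.mean(scores))                       # average over the unknown label

def select_next(X_lab, y_lab, X_pool):
    """Query the pool instance with the best average-case score."""
    scores = [average_case_score(X_lab, y_lab, x, X_pool) for x in X_pool]
    return int(np.argmax(scores))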

Cited by 45 publications (31 citation statements)
References 18 publications
“…Rubens and Sugiyama [20] study influence-based collaborative active learning, where multiple users may work together to provide a result, such as a group-based recommendation. Yang and Loog [21] also study active learning using uncertain information. With regards to understanding deep neural network structures, Liu et al [22] look at understanding the training process of deep generative models, Ming et al [23] focus on understanding the hidden memories of recurrent neural networks, and Kahng et al [24] propose ActiVis as a visual analytics tool for exploring large deep neural networks.…”
Section: Related Work
confidence: 99%
“…Uncertainty-based metrics are widely deployed in the AL field as the literature suggests [ 34 ], mainly due to their strong performance in terms of calculation efficiency and effectiveness in the process of selecting the most confusing instances. On the other hand, in the SSL field research works exist [ 8 , 35 ] proving the effectiveness of probabilistic iterative schemes.…”
Section: Proposed Methods
confidence: 99%
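As a concrete instance of the uncertainty-based metrics this statement refers to, here is a common entropy-sampling query rule; it is shown only for illustration and is not the specific scheme of the citing work.

# Entropy-based uncertainty sampling: query the pool instance whose predicted
# class distribution is most uncertain. Generic baseline, not a cited method.
import numpy as np

def entropy_query(proba):
    """proba: (n_pool, n_classes) predicted class probabilities for the pool.
    Returns the index of the instance with the highest predictive entropy."""
    eps = 1e-12                                   # guard against log(0)
    entropy = -(proba * np.log(proba + eps)).sum(axis=1)
    return int(np.argmax(entropy))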
“…The min-max view active learning directly measures the value of objective function during retraining procedure and selects the instance with minimum score in the worst case scenario. Recently, Yang & Loog (2016) proposed to improve the retraining-based algorithms by integrating the uncertainty information in the selection criterion.…”
Section: Related Work
confidence: 99%
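A rough sketch of the worst-case (min-max) retraining criterion paraphrased in the quote above: for each candidate, take the worst objective value over its possible labels, then query the candidate whose worst case is smallest. Using the retrained model's log-loss as the objective is an assumption for illustration, not the formulation of the cited work.

# Min-max retraining-based selection (illustrative; objective is assumed).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import log_loss

def worst_case_objective(X_lab, y_lab, x_cand):
    """Worst objective value over the candidate's two possible labels."""
    values = []
    for y_guess in (0, 1):                          # the true label is unknown: try both
        X_aug = np.vstack([X_lab, x_cand[None, :]])
        y_aug = np.append(y_lab, y_guess)
        clf = LogisticRegression().fit(X_aug, y_aug)
        values.append(log_loss(y_aug, clf.predict_proba(X_aug)))
    return max(values)                              # worst case over the label

def min_max_select(X_lab, y_lab, X_pool):
    """Query the candidate with the smallest worst-case objective."""
    scores = [worst_case_objective(X_lab, y_lab, x) for x in X_pool]
    return int(np.argmin(scores))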