2021 IEEE/CVF International Conference on Computer Vision (ICCV) 2021
DOI: 10.1109/iccv48922.2021.00914
Influence Selection for Active Learning

Cited by 36 publications (9 citation statements) · References 20 publications
“…Taking our method as an example, we use the classifier trained on the training set to infer unlabeled data, and then, based on the inference results, select unlabeled samples as candidates for labeling. However, if the classifier in this process is biased, it favors the categories it already recognizes well and ignores, during the sample selection stage, the samples it has not yet learned to classify 32 . The root cause of the data bias problem is the sampling mechanism's own preference for a subset of the data.…”
Section: Discussion
confidence: 99%
“…However, if the classifier in this process is biased, it favors the categories it already recognizes well and ignores, during the sample selection stage, the samples it has not yet learned to classify. 32 The root cause of the data bias problem is the sampling mechanism's own preference for a subset of the data.…”
Section: SBD Mechanism
confidence: 99%
“…To improve the sampling bias of any baseline sampling method, Settles and Craven propose an information density method, which is computationally expensive for large pools of unlabeled data [4]. Several other successful methods have been proposed to combat sampling bias and robustness issues, but all of these methods are designed to work with specific model types, e.g., with convolutional neural networks [12], [13], [15], [16], [28]. A generalizable method for reducing sampling bias is proposed by Elhamifar et al. [17], but it involves an optimization problem with n² variables, where n is the number of data points in the data set, so the method is not scalable to large data sets.…”
Section: A. Active Learning
confidence: 99%
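As a concrete illustration of the information density idea mentioned in the quote above, the sketch below weights each unlabeled point's uncertainty by its average cosine similarity to the rest of the pool. This is a minimal, hypothetical rendering of the general technique (all names here are invented, not the cited authors' code); note that building the pairwise similarity matrix costs O(n²) in the pool size, which is exactly the scalability concern the review raises.

```python
import numpy as np

def information_density(uncertainty, X, beta=1.0):
    """Weight uncertainty scores by average similarity to the pool (a sketch).

    Points that are both uncertain and lie in dense regions score highest.
    The n x n similarity matrix below is the O(n^2) cost noted in the review.
    """
    # Cosine similarity between every pair of pool points.
    Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
    density = (Xn @ Xn.T).mean(axis=1)
    # Clip so a negative average similarity cannot break a fractional beta.
    return uncertainty * np.clip(density, 0.0, None) ** beta

rng = np.random.default_rng(0)
pool = rng.normal(size=(200, 8))       # hypothetical unlabeled pool
uncertainty = rng.random(200)          # e.g. 1 - max predicted class probability
scores = information_density(uncertainty, pool)
query = int(np.argmax(scores))         # most uncertain *and* representative point
```

Because the density term down-weights isolated points, this variant counteracts the tendency of pure uncertainty sampling to chase outliers, at the price of the quadratic similarity computation.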
“…Both of these methods are only applicable to convolutional neural networks. Liu et al. propose an influence-based selection method, but the data point that has the most influence on the gradient of the model's loss function may not necessarily improve the model's performance the most, and could be an outlier [21].…”
Section: Active Learning Literature Review
confidence: 99%
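The outlier concern raised in the last quote can be made concrete with a simple gradient-norm proxy for influence. This is a hedged sketch under a logistic-regression assumption, not the actual method of [21]: each unlabeled point is scored by its expected loss-gradient magnitude, and the factorization of that score into an uncertainty term times ||x|| shows why points far from the data mean, i.e. potential outliers, tend to be selected.

```python
import numpy as np

def influence_scores(w, X_unlabeled):
    """Expected loss-gradient magnitude per unlabeled point (a sketch).

    For logistic regression, the log-loss gradient for (x, y) is
    (p - y) * x with p = sigmoid(w @ x). Taking the expectation of
    |p - y| over y ~ Bernoulli(p) gives 2 * p * (1 - p), so the score
    factorizes into an uncertainty term times ||x|| -- large-norm
    outliers score highly, which is the failure mode noted above.
    """
    p = 1.0 / (1.0 + np.exp(-(X_unlabeled @ w)))
    expected_residual = 2.0 * p * (1.0 - p)
    return expected_residual * np.linalg.norm(X_unlabeled, axis=1)

rng = np.random.default_rng(0)
X_pool = rng.normal(size=(100, 5))     # hypothetical unlabeled pool
w = rng.normal(size=5)                 # current model weights
scores = influence_scores(w, X_pool)
query_idx = np.argsort(scores)[-10:]   # ten highest-scoring candidates
```

A single mislabeled or anomalous point with a large feature norm can dominate this ranking even when labeling it would not help generalization, which is the critique the review directs at gradient-influence selection.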