2022
DOI: 10.48550/arxiv.2201.07459
Preprint

PT4AL: Using Self-Supervised Pretext Tasks for Active Learning

Cited by 3 publications (37 citation statements) | References 0 publications
“…Like classical active learning, deep active learning uses a variety of criteria for selecting data points. Uncertainty sampling is a major direction here as well [Gal et al., 2017; Yi et al., 2022; Ren et al., 2021; Zhan et al., 2022]. A well-known state-of-the-art method in this context is MC Dropout [Gal et al., 2017], which uses dropout at test time in order to create variants of an initially trained deep neural network model; averaging predictions over these model variants then yields estimates of class probabilities that can be used by standard acquisition functions.…”
Section: Related Work
confidence: 99%
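
The statement above describes MC Dropout in enough detail to sketch it. Below is a minimal PyTorch illustration; the toy network, pool size, and entropy-based acquisition are illustrative assumptions, not the exact setup of Gal et al. [2017].

```python
import torch
import torch.nn as nn

# Hypothetical toy classifier; any network containing nn.Dropout layers
# behaves the same way under this scheme.
model = nn.Sequential(
    nn.Linear(32, 64),
    nn.ReLU(),
    nn.Dropout(p=0.5),
    nn.Linear(64, 10),
)

def mc_dropout_probs(net, x, n_passes=20):
    """Average softmax outputs over stochastic forward passes (MC Dropout)."""
    net.train()  # keep dropout active at test time, as the statement describes
    with torch.no_grad():
        stacked = torch.stack(
            [torch.softmax(net(x), dim=-1) for _ in range(n_passes)]
        )
    return stacked.mean(dim=0)  # estimated class probabilities

# A standard acquisition function on top: query the highest-entropy points.
x_pool = torch.randn(100, 32)  # stand-in for the unlabeled pool
probs = mc_dropout_probs(model, x_pool)
entropy = -(probs * probs.clamp_min(1e-12).log()).sum(dim=-1)
query_idx = entropy.topk(k=10).indices  # indices to send for labeling
```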
“…Some approaches combine self-supervised or semi-supervised learning with standard active learning methods. For instance, the method by [Yi et al., 2022] performs a pretext task (such as rotation prediction) over the pool of unlabeled data points, which is then divided into batches based on performance on the pretext task. Then, uncertainty-based active sampling is performed over one batch.…”
Section: Related Work
confidence: 99%
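
A schematic sketch of the selection scheme this statement describes, assuming the pretext-task losses and uncertainty scores are already computed; the function name, the descending sort order, and the random stand-in scores are assumptions made for illustration.

```python
import numpy as np

def pt4al_style_query(pretext_losses, uncertainty, n_batches, batch_id, k):
    """Schematic version of the scheme described above: order the unlabeled
    pool by pretext-task loss, split it into disjoint batches, then run
    uncertainty sampling inside a single batch."""
    order = np.argsort(pretext_losses)[::-1]    # hardest pretext examples first
    batches = np.array_split(order, n_batches)  # disjoint sub-pools
    candidates = batches[batch_id]              # the batch used this round
    top_k = candidates[np.argsort(uncertainty[candidates])[::-1][:k]]
    return top_k

# Illustrative call with random stand-in scores for a 1000-point pool.
rng = np.random.default_rng(0)
rotation_losses = rng.random(1000)  # e.g. rotation-prediction losses
entropy_scores = rng.random(1000)   # e.g. task-model predictive entropy
queried = pt4al_style_query(rotation_losses, entropy_scores,
                            n_batches=10, batch_id=0, k=25)
```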
“…Another critical aspect is balancing uncertainty and diversity to label a more reasonable distribution of samples. Existing methods, such as PT4AL [18], employ equidistant disjoint sub-pools, but determining the size of each sub-pool poses challenges. A small sub-pool has a greater impact on the order of CDDs, limiting the choice of the uncertainty-based sampler.…”
Section: Introduction
confidence: 99%
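
One way to see the trade-off this statement raises: with a pool of N points split into B equidistant disjoint sub-pools, each round's uncertainty sampler chooses from only N/B candidates, so shrinking sub-pools shrinks the sampler's room to choose. A tiny numeric illustration, where the pool size is a made-up example:

```python
pool_size = 50_000  # made-up pool size; only the division matters
for n_batches in (10, 100, 1000):
    candidates_per_round = pool_size // n_batches  # equidistant disjoint split
    print(f"{n_batches:>4} sub-pools -> {candidates_per_round:>5} candidates per round")
# 10 sub-pools leave 5000 candidates per round; 1000 leave only 50.
```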