2022 IEEE/CVF Winter Conference on Applications of Computer Vision (WACV)
DOI: 10.1109/wacv51458.2022.00209

The Hitchhiker’s Guide to Prior-Shift Adaptation

Abstract: In many computer vision classification tasks, class priors at test time often differ from priors on the training set. In the case of such prior shift, classifiers must be adapted correspondingly to maintain close to optimal performance. This paper analyzes methods for adaptation of probabilistic classifiers to new priors and for estimating new priors on an unlabeled test set. We propose a novel method to address a known issue of prior estimation methods based on confusion matrices, where inconsistent estimates…
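The adaptation the abstract refers to follows the standard prior-shift correction rule: a probabilistic classifier trained under priors p_train(y) is adapted to new priors p_test(y) by re-weighting each posterior with the prior ratio and renormalizing. A minimal sketch (function and variable names are illustrative, not from the paper):

```python
import numpy as np

def adapt_posteriors(posteriors, train_priors, test_priors):
    """Re-weight classifier posteriors for new class priors.

    Standard prior-shift correction: the adapted posterior is
    proportional to p(y|x) * p_test(y) / p_train(y), renormalized
    per sample so each row sums to one.
    """
    posteriors = np.asarray(posteriors, dtype=float)   # shape (N, K)
    weights = np.asarray(test_priors, dtype=float) / np.asarray(train_priors, dtype=float)
    adapted = posteriors * weights                     # broadcast over the K classes
    return adapted / adapted.sum(axis=1, keepdims=True)
```

For example, a 50/50 posterior under uniform training priors shifts to the new priors exactly: `adapt_posteriors([[0.5, 0.5]], [0.5, 0.5], [0.8, 0.2])` yields `[[0.8, 0.2]]`.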


Cited by 8 publications (5 citation statements) · References 14 publications
“…The results show that in all cases, prior shift adaptation improves the recognition accuracy. The EM algorithm of Saerens et al. (2002) achieves the best result in three cases, the CM-L method of Sipka et al. (2022) in one case, but the differences are very small among the three compared prior shift adaptation methods.…”
Section: Results
confidence: 99%
“…We test existing prior shift adaptation methods and their impact on classification accuracy. The experiments with state-of-the-art methods for prior shift estimation (Saerens et al., 2002; Sipka et al., 2022), evaluated in Table 6, show that all three compared methods improve the classification accuracy in all cases. The differences among all three methods are rather small, with EM achieving slightly better results in 3 of 4 cases.…”
Section: Discussion
confidence: 99%
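The EM algorithm of Saerens et al. (2002) that both citing papers compare against estimates the unknown test priors from unlabeled data by alternating between re-weighting the posteriors with the current prior estimate and averaging the re-weighted posteriors. A sketch under the usual formulation (names and the tolerance-based stopping rule are illustrative):

```python
import numpy as np

def em_prior_estimate(posteriors, train_priors, n_iter=100, tol=1e-8):
    """EM estimate of test-set class priors (Saerens et al., 2002, style).

    posteriors: (N, K) predictions on the unlabeled test set from a
    classifier trained under train_priors.
    """
    posteriors = np.asarray(posteriors, dtype=float)
    train_priors = np.asarray(train_priors, dtype=float)
    priors = train_priors.copy()                    # initialize at training priors
    for _ in range(n_iter):
        # E-step: posteriors adapted to the current prior estimate
        w = posteriors * (priors / train_priors)
        w /= w.sum(axis=1, keepdims=True)
        # M-step: new priors are the average adapted posterior
        new_priors = w.mean(axis=0)
        if np.abs(new_priors - priors).max() < tol:
            priors = new_priors
            break
        priors = new_priors
    return priors
```

On a test set dominated by one class, the estimate moves from the uniform training priors toward that class, which is exactly the behavior the accuracy gains in the quoted experiments rely on.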
“…The model tries to approximate the true posterior probabilities p(y|x) by learning from more representative training data or using different optimization techniques. However, in practical evaluation scenarios, the prior probabilities may differ from those of the training set and may even change from one domain to another, which is often called prior shift or label shift [141].…”
Section: A. Posterior Adaptation
confidence: 99%
“…where the ratio p̃(y)/p(y) of test-time to training class priors implies the prior shift. In [141], the authors propose test-time adaptation of a fixed pretrained classifier after a prior shift happens by re-weighting its predictions based on confusion matrices. To avoid over-confident predictions due to overfitting to some classes, [142] proposes to calibrate the confidence of classifier predictions by adding class-specific bias terms:…”
Section: A. Posterior Adaptation
confidence: 99%
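The confusion-matrix-based estimation mentioned in the last snippet rests on a linear relation: with C[i, j] = p(predict j | true class i) estimated on held-out labeled data, the distribution q of hard predictions on the test set satisfies q = Cᵀπ, so the test priors π can be recovered by solving this system. A plain solve may leave the probability simplex, which is the inconsistency the paper addresses; the sketch below just clips and renormalizes as a simplified stand-in for the paper's maximum-likelihood fix (all names are illustrative):

```python
import numpy as np

def cm_prior_estimate(conf_matrix, test_pred_dist):
    """Confusion-matrix-based estimate of test-time class priors.

    conf_matrix[i, j]: p(classifier predicts j | true class i),
    estimated on held-out labeled data.
    test_pred_dist: distribution of hard predictions on the test set.
    """
    C = np.asarray(conf_matrix, dtype=float)
    q = np.asarray(test_pred_dist, dtype=float)
    # Solve q = C^T pi in the least-squares sense.
    pi, *_ = np.linalg.lstsq(C.T, q, rcond=None)
    # Crude projection back to the simplex; the paper's method
    # avoids this step by maximizing the likelihood directly.
    pi = np.clip(pi, 0.0, None)
    return pi / pi.sum()
```

For a well-conditioned 2-class confusion matrix the estimate is exact: with C = [[0.9, 0.1], [0.2, 0.8]] and true priors [0.7, 0.3], the predicted-class distribution is [0.69, 0.31], and solving recovers [0.7, 0.3].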