2021
DOI: 10.24200/sci.2021.51486.2210
Domain Adaptation via Bregman divergence minimization

Abstract: In recent years, Fisher linear discriminant analysis (FLDA)-based classification models have been among the most successful approaches, showing effective performance on a range of classification tasks. However, when the training data (source domain) follow a different distribution from the test data (target domain), FLDA-based models may not work well and their performance can degrade dramatically. To address this issue, we propose domain adaptation via Bregman divergence minimization (DAB)…
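The central quantity in the abstract, the Bregman divergence, can be illustrated with a minimal sketch. This is an illustration only, not the paper's DAB implementation: the generic form is D_φ(p, q) = φ(p) − φ(q) − ⟨∇φ(q), p − q⟩ for a convex generator φ, and choosing φ as the negative entropy recovers the familiar KL divergence between probability vectors.

```python
import numpy as np

def bregman_divergence(phi, grad_phi, p, q):
    """Generic Bregman divergence D_phi(p, q) = phi(p) - phi(q) - <grad phi(q), p - q>."""
    return phi(p) - phi(q) - np.dot(grad_phi(q), p - q)

# Negative-entropy generator: with probability vectors, the Bregman
# divergence it induces coincides with the KL divergence.
neg_entropy = lambda x: np.sum(x * np.log(x))
grad_neg_entropy = lambda x: np.log(x) + 1.0

p = np.array([0.4, 0.6])   # e.g. an empirical source-domain distribution
q = np.array([0.5, 0.5])   # e.g. an empirical target-domain distribution

d = bregman_divergence(neg_entropy, grad_neg_entropy, p, q)
kl = np.sum(p * np.log(p / q))  # KL(p || q), for comparison
```

Minimizing such a divergence between source and target feature distributions is the general idea behind divergence-based domain adaptation; the specific generator and optimization used by DAB are described in the paper itself.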

Cited by 3 publications (1 citation statement) | References 37 publications
“…Subsequently, some UDA methods take advantage of label correction to mitigate the disruptive impact of unreliable pseudo-labeled samples. As a result, pseudo-labeling methods can be categorized into two paradigms: with label correction [27,28] and without label correction [29]. For pseudo labeling without label correction, we refer to asymmetric tri-training for unsupervised domain adaptation [30], in which three classifiers are trained asymmetrically: two of them are trained with labeled source samples and used for pseudo labeling, while the third is trained with the pseudo-labeled samples.…”
Section: Methods With Pseudo Labeling
confidence: 99%
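The asymmetric tri-training scheme quoted above can be sketched with a toy setup (hypothetical data and nearest-centroid classifiers, not the cited paper's implementation): two classifiers are fit on bootstrap views of the labeled source domain, the unlabeled target points on which they agree are pseudo-labeled, and a third, target-specific classifier is trained on those pseudo-labeled samples.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-class data: two source blobs, and a covariate-shifted target domain.
n = 100
Xs = np.vstack([rng.normal([0, 0], 0.5, (n, 2)),
                rng.normal([2, 2], 0.5, (n, 2))])
ys = np.repeat([0, 1], n)
Xt = Xs + np.array([0.7, 0.7])  # unlabeled target samples (shifted source)
yt_true = ys                    # held out, used only to evaluate at the end

def fit_centroids(X, y):
    """A stand-in 'classifier': one centroid per class."""
    return np.stack([X[y == c].mean(axis=0) for c in (0, 1)])

def predict(centroids, X):
    dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return dists.argmin(axis=1)

# Two classifiers trained (asymmetrically) on different bootstrap
# resamples of the labeled source data; both are used for pseudo labeling.
idx1 = rng.choice(2 * n, 2 * n)
idx2 = rng.choice(2 * n, 2 * n)
c1 = fit_centroids(Xs[idx1], ys[idx1])
c2 = fit_centroids(Xs[idx2], ys[idx2])

# Pseudo-label only the target points on which the two classifiers agree.
p1, p2 = predict(c1, Xt), predict(c2, Xt)
agree = p1 == p2

# The third classifier is trained on the pseudo-labeled target samples.
ct = fit_centroids(Xt[agree], p1[agree])
acc = (predict(ct, Xt) == yt_true).mean()
```

In this toy example the agreement filter discards the least reliable pseudo-labels, so the target-specific classifier adapts to the shifted domain better than the source classifiers alone; the cited work [30] uses deep networks and a weight-divergence constraint rather than centroids.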