2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
DOI: 10.1109/cvpr.2019.00503

Contrastive Adaptation Network for Unsupervised Domain Adaptation

Abstract: Unsupervised Domain Adaptation (UDA) makes predictions for the target-domain data while manual annotations are available only in the source domain. Previous methods minimize the domain discrepancy while neglecting class information, which may lead to misalignment and poor generalization. To address this issue, this paper proposes the Contrastive Adaptation Network (CAN), which optimizes a new metric that explicitly models the intra-class domain discrepancy and the inter-class domain discrepancy. We design an …
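The intra-/inter-class discrepancy idea in the abstract can be sketched as a class-aware, MMD-style objective. This is a minimal illustration under stated assumptions, not the paper's implementation: `rbf`, `mmd2`, and `cdd` are hypothetical helper names, the target labels stand in for the pseudo-labels CAN actually estimates by clustering, and the weighting details of the real CDD objective are omitted.

```python
import math

def rbf(x, y, sigma=1.0):
    # Gaussian kernel k(x, y) = exp(-||x - y||^2 / (2 * sigma^2)) on plain tuples.
    sq_dist = sum((xi - yi) ** 2 for xi, yi in zip(x, y))
    return math.exp(-sq_dist / (2 * sigma ** 2))

def mmd2(A, B, sigma=1.0):
    # Biased squared-MMD estimate: E[k(a,a')] + E[k(b,b')] - 2 E[k(a,b)].
    def mean_k(X, Y):
        return sum(rbf(x, y, sigma) for x in X for y in Y) / (len(X) * len(Y))
    return mean_k(A, A) + mean_k(B, B) - 2 * mean_k(A, B)

def cdd(xs, ys, xt, yt, classes, sigma=1.0):
    # Contrastive-style objective: average same-class (intra-class) discrepancy
    # minus average cross-class (inter-class) discrepancy. Minimizing it pulls
    # same-class source/target features together and pushes different classes apart.
    intra, inter = [], []
    for c1 in classes:
        for c2 in classes:
            S = [x for x, y in zip(xs, ys) if y == c1]
            T = [x for x, y in zip(xt, yt) if y == c2]
            if not S or not T:
                continue
            (intra if c1 == c2 else inter).append(mmd2(S, T, sigma))
    return sum(intra) / len(intra) - sum(inter) / len(inter)
```

When the target clusters line up with the source classes, the intra-class term is small and the inter-class term is large, so the objective is negative; training would drive it further down.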

Cited by 836 publications (493 citation statements)
References 46 publications
“…In recent years, deep neural networks have proved effective in domain adaptation, and existing deep-network-based methods can be roughly divided into the following categories. The first category is based on discrepancy, i.e., the discrepancy between features extracted from the source and target domains should be as small as possible; commonly used measures of feature discrepancy include: Maximum Mean Discrepancy (MMD) [3][4], Joint Maximum Mean Discrepancy (JMMD) [5], Weighted Maximum Mean Discrepancy (WMMD) [6], Wasserstein discrepancy [7], Sliced Wasserstein Discrepancy (SWD) [8], Orthogonal Discrepancy [9], Correlation Discrepancy [10][11][12], Source-Guided Discrepancy (SGD) [13], Contrastive Domain Discrepancy (CDD) [14], and pseudo-label differences [15][16]. Besides the marginal distributions, the output class distributions are also considered in domain adaptation [17].…”
Section: Related Work (mentioning; confidence: 99%)
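Several of the measures listed in the excerpt above build on the Maximum Mean Discrepancy. The following is a minimal, stdlib-only sketch of a biased squared-MMD estimate with a Gaussian kernel, assuming feature vectors are plain tuples; the function names are illustrative, not from any of the cited works.

```python
import math

def rbf(x, y, sigma=1.0):
    # Gaussian kernel k(x, y) = exp(-||x - y||^2 / (2 * sigma^2)).
    sq_dist = sum((xi - yi) ** 2 for xi, yi in zip(x, y))
    return math.exp(-sq_dist / (2 * sigma ** 2))

def mmd2(A, B, sigma=1.0):
    # Biased estimate of squared MMD between sample sets A and B:
    # E[k(a, a')] + E[k(b, b')] - 2 E[k(a, b)].
    def mean_k(X, Y):
        return sum(rbf(x, y, sigma) for x in X for y in Y) / (len(X) * len(Y))
    return mean_k(A, A) + mean_k(B, B) - 2 * mean_k(A, B)

# Nearby sample sets give a small MMD; distant ones give a larger MMD.
A = [(0.0,), (0.1,)]
B = [(0.05,), (0.15,)]
C = [(3.0,), (3.1,)]
```

Here `mmd2(A, B)` is near zero while `mmd2(A, C)` is much larger; discrepancy-based adaptation methods minimize such a term between source-domain and target-domain features.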
“…2) How to use multi-domain information and reduce the discrepancy between 2D images and 3D objects in a unified framework. Traditional domain adaptation works focus on a single modality (e.g., 2D images) and achieve impressive performance [2,18,46,50]. Recently, some works have turned to cross-domain retrieval, e.g., sketch-based 3D object retrieval [4,22,48] and image-based 3D object retrieval [24].…”
Section: Motivation (mentioning; confidence: 99%)
“…Transferrable Prototypical Networks (TPN) are presented for adaptation such that the prototypes for each class in the two domains are close in the embedding space, and the score distributions predicted by the prototypes on source and target data are similar [33]. The Contrastive Adaptation Network (CAN) is proposed to optimize a new metric that explicitly models the intra-class domain discrepancy and the inter-class domain discrepancy [34]. A dynamic Bayesian network (DBN) based fault-diagnosis methodology for electronic systems in the presence of TF and IF is proposed [35].…”
Section: Introduction (mentioning; confidence: 99%)