2021
DOI: 10.48550/arxiv.2102.06679
Preprint

Adversarial Branch Architecture Search for Unsupervised Domain Adaptation

Abstract: Unsupervised Domain Adaptation (UDA) is a key field in visual recognition, as it enables robust performances across different visual domains. In the deep learning era, the performance of UDA methods has been driven by better losses and by improved network architectures, specifically the addition of auxiliary domain-alignment branches to pre-trained backbones. However, all the neural architectures proposed so far are hand-crafted, which might hinder further progress.
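The hand-crafted auxiliary branches the abstract refers to are typically small adversarial domain classifiers attached to the backbone features, in the spirit of DANN. The sketch below is a minimal, hypothetical illustration of such a hand-crafted design (the gradient-reversal trick, the ImageNet-pretrained ResNet-50, and the branch depth and widths are assumptions, not the paper's searched architecture); it is exactly this manual choice of branch architecture that the paper proposes to search automatically.

```python
# Minimal sketch (assumption, not the paper's searched architecture):
# a DANN-style auxiliary domain-classification branch with gradient reversal
# attached to an ImageNet-pretrained backbone.
import torch
import torch.nn as nn
from torchvision.models import resnet50


class GradReverse(torch.autograd.Function):
    """Identity in the forward pass; reverses (and scales) gradients in the backward pass."""

    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lambd * grad_output, None


class AdversarialUDA(nn.Module):
    def __init__(self, num_classes, feat_dim=2048, lambd=1.0):
        super().__init__()
        backbone = resnet50(weights="IMAGENET1K_V1")  # pre-trained backbone
        self.backbone = nn.Sequential(*list(backbone.children())[:-1])  # drop the fc head
        self.classifier = nn.Linear(feat_dim, num_classes)  # task head (labeled source data)
        # Hand-crafted auxiliary domain-alignment branch; sizes are hypothetical.
        self.domain_branch = nn.Sequential(
            nn.Linear(feat_dim, 1024), nn.ReLU(inplace=True),
            nn.Linear(1024, 1024), nn.ReLU(inplace=True),
            nn.Linear(1024, 2),  # source vs. target
        )
        self.lambd = lambd

    def forward(self, x):
        feats = self.backbone(x).flatten(1)
        class_logits = self.classifier(feats)
        domain_logits = self.domain_branch(GradReverse.apply(feats, self.lambd))
        return class_logits, domain_logits
```

During training, the task head is supervised on labeled source images, while the reversed gradient from the domain branch pushes the backbone toward features on which source and target domains are hard to distinguish.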

Cited by 2 publications (7 citation statements) | References 29 publications
“…Different from the existing literature, in this paper, we investigate the possibility and an effective solution to search for optimal architectures towards better domain adaptation. Concurrently, we find that Li et al. [26] and Robbiano et al. [40] have investigated a similar topic: the generalization abilities of architectures across domains. The differences are two-fold: 1) we propose a novel search space for diverse attention configurations, which is different from the search space of AdaptNAS [26] (akin to NASNet [57]) and ABAS [40] (which only changes the architecture of the auxiliary adversarial branch); 2) we focus on an effective NAS protocol to search for optimal attention configurations towards UDA tasks.…”
Section: Related Work (supporting)
confidence: 66%
“…Differences from Concurrent Methods. Concurrently, Li et al. [26] and Robbiano et al. [40] also propose to search for better transferable network architectures. The differences between our EvoADA and these methods are: 1) on the design of the search space: AdaptNAS [26] adopts the search space of NASNet [57], which includes basic CNN operations (e.g., different pooling or convolution operations) and searches for optimal structures at the cell level, while ABAS [40] only changes the structure of the auxiliary branch.…”
Section: Module Attention On (mentioning)
confidence: 99%
“…Li and Peng (2020) propose a DARTS-like method for DA, which combines DARTS and DA into one framework. Robbiano et al. (2021) aim to learn an auxiliary branch network from data for an adversarial DA method. In this paper, different from those works, we aim to leverage NAS to search for optimal neural architectures for the proposed DAMPC method.…”
Section: Neural Architecture Search For Domain Adaptation (mentioning)
confidence: 99%
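As a rough illustration of the idea discussed in these citing works (treating the auxiliary branch itself as the searchable component), the hypothetical sketch below samples candidate branch configurations from a toy search space. The depth/width dimensions and the random-sampling loop are assumptions for illustration only, not the actual search space or protocol of ABAS.

```python
# Hypothetical sketch: sampling auxiliary-branch architectures from a toy search space.
# The search dimensions (depth, width) are made up for illustration only.
import random
import torch.nn as nn


def sample_branch_config(rng):
    """Draw one candidate auxiliary-branch configuration."""
    return {"depth": rng.choice([1, 2, 3]), "width": rng.choice([256, 512, 1024])}


def build_branch(config, feat_dim=2048, num_domains=2):
    """Instantiate the domain-classification branch described by `config`."""
    layers, in_dim = [], feat_dim
    for _ in range(config["depth"]):
        layers += [nn.Linear(in_dim, config["width"]), nn.ReLU(inplace=True)]
        in_dim = config["width"]
    layers.append(nn.Linear(in_dim, num_domains))
    return nn.Sequential(*layers)


rng = random.Random(0)
candidates = [sample_branch_config(rng) for _ in range(5)]
branches = [build_branch(cfg) for cfg in candidates]
# Each candidate branch would then be plugged into the adversarial UDA model and
# scored with a proxy of target-domain performance to guide the search.
```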