2022 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn55064.2022.9892200

Continual Unsupervised Domain Adaptation for Semantic Segmentation using a Class-Specific Transfer

Cited by 14 publications (7 citation statements)
References 30 publications
“…Another line of work has explored diverse setups in terms of domain availability. Some propose to handle multiple source [58], [59] or target [17], [18], [19], [60], [61], [62], [63] domains. This can involve a single adaptation phase [58], [59], or multiple phases in which different domains are experienced in different learning steps in an incremental fashion (but still with a fixed class set) [17], [18], [19], [62], [63], in effect undertaking continual learning from the domain adaptation perspective.…”
Section: Related Work (mentioning; confidence: 99%)
“…Most methods in this category store information about a specific domain's style in order to transform source images into the styles of the target domains during training. Recent work achieves this by storing low-frequency components of the images for every domain [14], by capturing the style of the domain with generative models [15], [16] or by using a domain-specific memory to mitigate forgetting [17]. Supervised domain-incremental learning is rarely considered [11].…”
Section: A. Continual Semantic Segmentation (mentioning; confidence: 99%)
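The low-frequency transfer referenced in [14] above can be made concrete with a short sketch. In Fourier-based style transfer, only the low-frequency amplitude spectrum of an image, which mostly encodes global appearance ("style"), is replaced with that of an image from another domain, while the phase (and hence the semantic content) is kept. The snippet below is a minimal NumPy sketch of this idea, not the cited papers' exact implementation; the function name `low_freq_style_transfer` and the `beta` window parameter are illustrative assumptions.

```python
import numpy as np

def low_freq_style_transfer(source, target, beta=0.01):
    """Replace the low-frequency FFT amplitudes of `source` with those of
    `target`, keeping the source phase (and thus its semantic content).

    source, target: float arrays of shape (H, W, C) with matching sizes.
    beta: fraction of the (centered) spectrum treated as "low frequency".
    Both names and the windowing scheme are illustrative assumptions.
    """
    # 2-D FFT per channel; shift so low frequencies sit at the center.
    fft_src = np.fft.fftshift(np.fft.fft2(source, axes=(0, 1)), axes=(0, 1))
    fft_trg = np.fft.fftshift(np.fft.fft2(target, axes=(0, 1)), axes=(0, 1))

    amp_src, phase_src = np.abs(fft_src), np.angle(fft_src)
    amp_trg = np.abs(fft_trg)

    # Square window of low frequencies around the spectrum center.
    h, w = source.shape[:2]
    b = int(np.floor(min(h, w) * beta))
    ch, cw = h // 2, w // 2
    amp_src[ch - b:ch + b + 1, cw - b:cw + b + 1] = \
        amp_trg[ch - b:ch + b + 1, cw - b:cw + b + 1]

    # Recombine the swapped amplitude with the original source phase.
    fft_mix = amp_src * np.exp(1j * phase_src)
    mixed = np.fft.ifft2(np.fft.ifftshift(fft_mix, axes=(0, 1)), axes=(0, 1))
    return np.real(mixed)
```

Stylized copies of labeled source images produced this way can then serve as training input for the segmentation model, one copy per stored target-domain style.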
“…CDA has the goal of adapting a model trained on a labeled source dataset to a sequence of different target domains for which no labels are provided. To compensate for the missing labels of the target domains, the model retains access to the initial source dataset throughout the entire training sequence [36]. Methods in this category work mostly by storing information about the style of each specific domain, so that during training the source images can be transferred into the styles of the different target domains.…”
Section: Continual Unsupervised Domain Adaptation (mentioning; confidence: 99%)
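As a rough illustration of the CDA protocol described in this statement, the skeleton below adapts a model to target domains that arrive one at a time while the labeled source set stays available, using a per-domain style bank so source images can be rendered in every style seen so far. All names here (`extract_style`, `apply_style`, `train_step`) are hypothetical placeholders, not an API from the cited works.

```python
# Hypothetical skeleton of the continual UDA protocol described above.
# Target domains arrive sequentially; the labeled source set remains
# available throughout; per-domain style descriptors are kept in a bank.

def continual_domain_adaptation(model, source_loader, target_domains,
                                extract_style, apply_style, train_step):
    style_bank = []  # one style descriptor per target domain seen so far

    for target_loader in target_domains:  # one adaptation phase per domain
        # Summarize the new domain's appearance once, e.g. via its
        # low-frequency amplitude statistics (see the Fourier sketch above).
        style_bank.append(extract_style(target_loader))

        for images, labels in source_loader:
            # Train on labeled source images rendered in the styles of the
            # current and all earlier target domains; replaying old styles
            # mitigates forgetting of previously adapted domains.
            for style in style_bank:
                train_step(model, apply_style(images, style), labels)

    return model
```

Replaying all stored styles on each source batch is one simple way to counter forgetting of earlier target domains; the cited methods differ mainly in how the style is represented and stored.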