2021
DOI: 10.1007/978-3-030-71704-9_65
A Brief Review of Domain Adaptation

Cited by 312 publications (131 citation statements)
References 48 publications
“…Examples are BERTje [19] and RobBERT [3] (the Dutch equivalents of BERT and RoBERTa, respectively), and CamemBERT [14], a French BERT model. Domain adaptation, a special case of transfer learning in which the model is first trained on unsupervised data from the domain of the intended task, aims to improve results further "by minimizing the difference between the domain distributions" ([6], p. 1), thus producing a model that learns optimally from the training data. In the COVID-19 domain, COVID-Twitter-BERT [16], a BERT-large model pre-trained on COVID-19 tweets, has shown statistically significant gains over the baseline BERT-large in various applications, including vaccine stance classification.…”
Section: Related Research (mentioning)
confidence: 99%
“…As a solution adopted from transfer learning, Domain Adaptation (DA) [108] tries to reduce the disparity between the source and target domains, improving a model's performance on unseen data. This technique applies when the training data does not have the same distribution as the test data, which is common in real-world applications and results in performance degradation.…”
Section: Domain Adaptation (mentioning)
confidence: 99%
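The statement above characterizes domain adaptation as reducing the disparity between the source and target distributions. One standard way to quantify that disparity is the Maximum Mean Discrepancy (MMD), which several DA methods minimize as a training objective. The sketch below (a minimal illustration on synthetic data, not code from the reviewed paper; the function names, `gamma` value, and the two synthetic "domains" are my own assumptions) computes a biased estimate of squared MMD with an RBF kernel and shows that it grows with the size of the domain shift:

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise squared Euclidean distances, then Gaussian (RBF) kernel values.
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * d2)

def mmd2(Xs, Xt, gamma=1.0):
    # Biased estimate of squared Maximum Mean Discrepancy between the
    # source sample Xs and the target sample Xt in the kernel's RKHS.
    return (rbf_kernel(Xs, Xs, gamma).mean()
            + rbf_kernel(Xt, Xt, gamma).mean()
            - 2.0 * rbf_kernel(Xs, Xt, gamma).mean())

rng = np.random.default_rng(0)
source = rng.normal(0.0, 1.0, (200, 5))       # hypothetical source-domain features
target_near = rng.normal(0.1, 1.0, (200, 5))  # small domain shift
target_far = rng.normal(2.0, 1.0, (200, 5))   # large domain shift

# The discrepancy to the strongly shifted domain should be larger.
print(mmd2(source, target_near) < mmd2(source, target_far))  # prints: True
```

In adversarial or discrepancy-based DA methods, a term like `mmd2` over learned feature representations is added to the task loss, so the feature extractor is pushed to make source and target features indistinguishable.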
“…Previous works perform disentanglement using paraphrase pairs as information for semantics, and/or constituency parses as information for syntax. (Footnote 1: github.com/ghazi-f/QKVAE) The dependence of models on labeled data is known to entail high cost (see Seddah et al., 2020 on syntactic annotation), and to often require new labels to handle problems such as concept drift (Lu et al., 2019) and domain adaptation (Farahani et al., 2021).…”
Section: Introduction (mentioning)
confidence: 99%