2019
DOI: 10.48550/arxiv.1911.00202
Preprint
Forget Me Not: Reducing Catastrophic Forgetting for Domain Adaptation in Reading Comprehension

Cited by 3 publications (4 citation statements)
References 16 publications
“…In future work, we intend to explore various methods to improve the performance of T+DAPT by remedying catastrophic forgetting and maximizing knowledge transfer. For this we hope to emulate the regularization used by Xu et al (2020) and implement multi-task learning and continual learning methods like AdapterNet (Hazan et al, 2018). In order to improve the transferability of learned features, we will explore different auxiliary tasks such as NLI and sentiment analysis in addition to few-shot learning approaches.…”
Section: Discussion
confidence: 99%
“…We train our own DAPT baselines on the Movies and COVID-19 domains. Xu et al (2020) explore methods to reduce catastrophic forgetting during language model fine-tuning. They apply topic modeling on the MS MARCO dataset (Bajaj et al, 2018) to generate 6 narrow domain-specific data sets, from which we use BioQA and MoviesQA as domain-specific reading comprehension benchmarks.…”
Section: Related Work
confidence: 99%
“…To deal with this issue, He et al (2019) proposed to replay pre-train data during fine-tuning heavily, and Chen et al (2020) proposed an improved Adam optimizer to recall knowledge captured during pretraining. The catastrophic forgetting issue is also noticed in domain adaptation setups for neural machine translation (Saunders et al, 2019; Thompson et al, 2019; Varis and Bojar, 2019) and the reading comprehension task (Xu et al, 2019). Lee (2017) firstly studied the continual learning setting for dialog state tracking in task-oriented dialog systems.…”
Section: Introduction
confidence: 99%
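
The replay idea mentioned in the last citation statement (He et al, 2019) can be illustrated with a minimal sketch: during fine-tuning on the target domain, a fraction of batches drawn from the pre-training distribution is interleaved with the target-domain batches so the model keeps rehearsing its original knowledge. The function name, replay ratio, and toy data below are illustrative assumptions, not details taken from the cited papers.

import random
from typing import Iterable, List, Tuple

import torch
import torch.nn as nn

def mixed_batches(finetune_data: List[Tuple[torch.Tensor, torch.Tensor]],
                  replay_data: List[Tuple[torch.Tensor, torch.Tensor]],
                  replay_ratio: float = 0.25) -> Iterable[Tuple[torch.Tensor, torch.Tensor]]:
    # Yield fine-tuning batches, occasionally interleaving a replayed
    # batch from the pre-training distribution (hypothetical sketch).
    for batch in finetune_data:
        yield batch
        if replay_data and random.random() < replay_ratio:
            yield random.choice(replay_data)

# Toy demonstration with a linear model and synthetic data.
model = nn.Linear(8, 2)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

finetune = [(torch.randn(4, 8), torch.randint(0, 2, (4,))) for _ in range(10)]
replay = [(torch.randn(4, 8), torch.randint(0, 2, (4,))) for _ in range(10)]

for x, y in mixed_batches(finetune, replay):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()

In a realistic setup the replayed batches would use the pre-training objective (e.g. masked language modeling) rather than the downstream loss; the mixing loop above only shows the interleaving pattern.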