Findings of the Association for Computational Linguistics: EMNLP 2021
DOI: 10.18653/v1/2021.findings-emnlp.62

Learn Continually, Generalize Rapidly: Lifelong Knowledge Accumulation for Few-shot Learning

Abstract: The ability to continuously expand knowledge over time and utilize it to rapidly generalize to new tasks is a key feature of human linguistic intelligence. Existing models that pursue rapid generalization to new tasks (e.g., few-shot learning methods), however, are mostly trained in a single shot on fixed datasets, unable to dynamically expand their knowledge; while continual learning algorithms are not specifically designed for rapid generalization. We present a new learning setup, Continual Learning of Few-Shot Learners (CLIF) …

Cited by 15 publications (14 citation statements). References 51 publications.
“…Transformers are much larger than prior models such as convolutional neural networks, and using weight regularization or experience replay on the full model may not lead to the desired results. To improve upon these methods, we introduce Continual Learning of Few-Shot Learners (CLIF), which aims to tackle the challenges posed by CL and FSL within a unified framework using a base transformer (Jin et al. 2021). CLIF is designed to address the learning process where a model sequentially learns from a diverse range of NLP tasks.…”
Section: Continual Learning (mentioning)
confidence: 99%
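The excerpt above describes the CLIF setup at a high level: a shared transformer sequentially learns a stream of NLP tasks and is then expected to generalize to unseen tasks from only a few examples. Below is a minimal sketch of that two-phase loop, assuming a PyTorch encoder that maps a batch of features to pooled representations; the helper names, per-task heads, and hyperparameters are illustrative and are not taken from the paper's released code.

# Minimal sketch of a CLIF-style training loop (hypothetical helper names, not
# the authors' released code): a shared encoder sequentially visits a stream of
# upstream NLP tasks, then a copy is adapted to unseen tasks from few examples.
import copy
import torch
from torch import nn

class TaskHead(nn.Module):
    # One lightweight classification head per task on top of the shared encoder.
    def __init__(self, hidden_size: int, num_labels: int):
        super().__init__()
        self.proj = nn.Linear(hidden_size, num_labels)

    def forward(self, pooled):
        return self.proj(pooled)

def train_on_task(encoder, head, loader, epochs=3, lr=3e-5):
    # Standard supervised fine-tuning on a single task in the stream.
    # Assumes `encoder(features)` returns a pooled representation per example.
    opt = torch.optim.AdamW(list(encoder.parameters()) + list(head.parameters()), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for features, labels in loader:
            loss = loss_fn(head(encoder(features)), labels)
            opt.zero_grad()
            loss.backward()
            opt.step()

def continual_then_few_shot(encoder, upstream_tasks, few_shot_tasks, hidden=768):
    # Phase 1: continual learning over the upstream task stream.
    for task in upstream_tasks:
        head = TaskHead(hidden, task["num_labels"])
        train_on_task(encoder, head, task["loader"])
    # Phase 2: rapid generalization, adapt a copy of the encoder to each unseen
    # task using only its few-shot examples, keeping the accumulated encoder intact.
    for task in few_shot_tasks:
        adapted = copy.deepcopy(encoder)
        head = TaskHead(hidden, task["num_labels"])
        train_on_task(adapted, head, task["few_shot_loader"], epochs=10)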
“…Systems in (Sun et al., 2020a,b) incrementally learned among five disparate NLP tasks. Jin et al. (2021) further extended the size of the task stream (one benchmark has 26 tasks, the other covers 55) and studied TCL in a few-shot scenario. It is worth mentioning that all the listed work in TCL consistently transformed all tasks into a question-answering format (as pointed out in (McCann et al., 2018), many NLP tasks can be formulated as question answering), thus TCL in this literature was effectively converted into DCL.…”
Section: Related Work (mentioning)
confidence: 99%
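This excerpt notes that prior TCL work casts every task into a question-answering format (McCann et al., 2018). A small illustration of what such a conversion can look like is sketched below; the task names, question wordings, and label mappings are assumptions chosen for illustration, not the exact format used by any of the cited papers.

# Illustrative conversion of classification examples into a question-answering
# format in the spirit of McCann et al. (2018); the task names, question
# wordings, and label mappings here are assumptions for illustration only.
def to_qa_format(example: dict, task: str) -> dict:
    if task == "sentiment":
        return {
            "context": example["text"],
            "question": "Is this review positive or negative?",
            "answer": "positive" if example["label"] == 1 else "negative",
        }
    if task == "nli":
        labels = {0: "entailment", 1: "neutral", 2: "contradiction"}
        return {
            "context": "Premise: {} Hypothesis: {}".format(example["premise"], example["hypothesis"]),
            "question": "Does the premise entail, contradict, or remain neutral toward the hypothesis?",
            "answer": labels[example["label"]],
        }
    raise ValueError("unknown task: " + task)

# Example usage:
# to_qa_format({"text": "Great movie!", "label": 1}, "sentiment")
# -> {"context": "Great movie!", "question": "Is this review positive or negative?", "answer": "positive"}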
“…Similar to (Xia et al., 2021; Jin et al., 2021), our work also focuses on low-resource continual learning; in contrast, our learning problem belongs to TCL, while each task in our formulation is expressed by instructions instead of labeled examples.…”
Section: Related Work (mentioning)
confidence: 99%
“…The primary challenge addressed in the continual learning literature is overcoming catastrophic forgetting (French, 1999; Biesialska et al., 2020; Wu et al., 2022). Various approaches have been proposed to tackle the forgetting problem, e.g., rehearsal-based methods (Han et al., 2020; de Masson d'Autume et al., 2019), regularization-based methods (Li et al., 2019; Huang et al., 2021), and dynamic architecture methods (Ke et al., 2021). Continual few-shot learning is an even more challenging yet realistic setting, which requires learners to adapt quickly during learning (Jin et al., 2021; Yoon et al., 2020). Compared to the numerous studies outside NLP applications (Yap et al., 2021; Yoon et al., 2020; Dong et al., 2021), continual few-shot language learning is still an under-explored area (Jin et al., 2021).…”
Section: Vae-dprior (mentioning)
confidence: 99%
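The last excerpt groups forgetting mitigations into rehearsal-based, regularization-based, and dynamic-architecture methods. The sketch below illustrates only the rehearsal-based (experience replay) idea: keep a small reservoir of past examples and mix them into each training step. The buffer capacity, replay size, and class and function names are illustrative choices, not values or code from the cited papers.

# Sketch of a rehearsal-based (experience replay) update, one of the forgetting
# mitigations listed above; capacity and replay size are illustrative choices.
import random
import torch
from torch import nn

class ReplayBuffer:
    # Reservoir-sampled memory of past (input, label) pairs.
    def __init__(self, capacity=1000):
        self.capacity = capacity
        self.data = []
        self.seen = 0

    def add(self, example):
        self.seen += 1
        if len(self.data) < self.capacity:
            self.data.append(example)
        else:
            idx = random.randrange(self.seen)
            if idx < self.capacity:
                self.data[idx] = example

    def sample(self, k):
        return random.sample(self.data, min(k, len(self.data)))

def replay_step(model, opt, batch, buffer, replay_k=8):
    # Train on the current batch mixed with examples replayed from memory.
    loss_fn = nn.CrossEntropyLoss()
    inputs, labels = batch
    replayed = buffer.sample(replay_k)
    if replayed:
        inputs = torch.cat([inputs, torch.stack([x for x, _ in replayed])])
        labels = torch.cat([labels, torch.stack([y for _, y in replayed])])
    loss = loss_fn(model(inputs), labels)
    opt.zero_grad()
    loss.backward()
    opt.step()
    for x, y in zip(*batch):  # store current examples for future replay
        buffer.add((x, y))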