2021
DOI: 10.48550/arxiv.2112.03271
Preprint
Adapting BERT for Continual Learning of a Sequence of Aspect Sentiment Classification Tasks

Abstract: This paper studies continual learning (CL) of a sequence of aspect sentiment classification (ASC) tasks. Although some CL techniques have been proposed for document sentiment classification, we are not aware of any CL work on ASC. A CL system that incrementally learns a sequence of ASC tasks should address the following two issues: (1) transfer knowledge learned from previous tasks to the new task to help it learn a better model, and (2) maintain the performance of the models for previous tasks so that they are not forgotten.

Cited by 3 publications (9 citation statements)
References 7 publications
“…BERT can take one or two sentences as input and differentiate them using the special token [SEP]. The [CLS] token, which is unique to classification tasks, always appears at the beginning of the text 17 .…”
Section: Proposed System
confidence: 99%
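The [CLS]/[SEP] input layout described in the quote above can be sketched in plain Python (illustrative whitespace tokenization only; a real BERT pipeline would use a WordPiece tokenizer):

```python
def bert_input_tokens(sentence_a, sentence_b=None):
    """Build the token layout BERT expects: [CLS] always first,
    [SEP] closing each sentence so one or two sentences can be told apart."""
    tokens = ["[CLS]"] + sentence_a.split() + ["[SEP]"]
    if sentence_b is not None:
        tokens += sentence_b.split() + ["[SEP]"]
    return tokens

print(bert_input_tokens("great battery life", "terrible screen"))
# → ['[CLS]', 'great', 'battery', 'life', '[SEP]', 'terrible', 'screen', '[SEP]']
```

For classification tasks, the hidden state at the [CLS] position is what gets fed to the classifier head.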
See 2 more Smart Citations
“…BERT can take one or two sentences as input and differentiate them using the special token [SEP]. The [CLS] token, which is unique to classification tasks, always appears at the beginning of the text 17 .…”
Section: Proposed Systemmentioning
confidence: 99%
“…Adapter-BERT outperforms fine-tuned BERT in terms of performance. Figure 2 illustrates the architecture of adapter-BERT 17,18 .…”
Section: Proposed System
confidence: 99%
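A minimal sketch of the bottleneck adapter that adapter-BERT inserts into each transformer layer (assumed shapes and near-identity initialization; the shapes and init scheme here are illustrative, not the paper's exact configuration):

```python
import numpy as np

def adapter(h, W_down, W_up):
    """Bottleneck adapter: down-project, apply ReLU, up-project,
    then add the result back to the input via a skip-connection."""
    z = np.maximum(0.0, W_down @ h)  # down-projection + nonlinearity
    return h + W_up @ z              # up-projection folded in by the residual

rng = np.random.default_rng(0)
hidden, bottleneck = 8, 2
W_down = rng.standard_normal((bottleneck, hidden)) * 0.01
W_up = np.zeros((hidden, bottleneck))  # zero init: adapter starts as a no-op
h = rng.standard_normal(hidden)
out = adapter(h, W_down, W_up)         # equals h at initialization
```

Because of the skip-connection and zero up-projection, the adapter initially passes activations through unchanged, so the frozen BERT backbone's behavior is preserved before fine-tuning.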
“…B-CL replaces adapters with CLA, containing a knowledge-sharing module (KSM) and a task-specific module (TSM), both with skip-connections. Image and modified caption are taken from (Ke et al., 2021b).…”
Section: Model Updating Challenges
confidence: 99%
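The CLA structure the quote describes can be sketched as two residual sub-modules, one shared and one selected per task (assumption: both modules reduced to single linear maps here; the actual KSM/TSM in B-CL are more elaborate):

```python
import numpy as np

def cla_block(h, ksm_W, tsm_Ws, task_id):
    """Sketch of B-CL's CL-Adapter (CLA): a knowledge-sharing module (KSM)
    shared across tasks, then a task-specific module (TSM) chosen by task id,
    each wrapped in a skip-connection."""
    h = h + ksm_W @ h            # KSM: shared across all tasks
    h = h + tsm_Ws[task_id] @ h  # TSM: one module per task
    return h

hidden = 4
h = np.ones(hidden)
ksm_W = np.zeros((hidden, hidden))
tsm_Ws = {0: np.zeros((hidden, hidden)), 1: np.eye(hidden)}
# With zero weights, the skip-connections pass h through unchanged:
print(cla_block(h, ksm_W, tsm_Ws, 0))  # → [1. 1. 1. 1.]
```

Keeping the TSM per task while sharing the KSM is what lets the block both transfer knowledge across tasks and protect task-specific behavior from being overwritten.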
“…For example, if we consider the BERT layer in Figure 3, we can observe that the adapter layers for the MTL approach in Figure 3(A) are in the same positions as the CL adapters in Figure 3(B) (Ke et al., 2021b). To switch from MTL to CL we only need to change the adapters.…”
Section: Model Updating Challenges
confidence: 99%
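The "only change adapters" point can be illustrated with a toy registry (hypothetical API; the adapter here is collapsed to a scalar shift purely for illustration):

```python
class FrozenBackbone:
    """Toy sketch: the backbone computation is fixed; only the adapter module
    at each layer position is swapped when moving between MTL and CL regimes."""
    def __init__(self):
        self.adapters = {}  # regime name -> adapter (here just a scalar shift)

    def register(self, regime, adapter):
        self.adapters[regime] = adapter

    def forward(self, x, regime):
        # Same backbone, same adapter position; only the plugged-in module differs.
        return x + self.adapters[regime]

model = FrozenBackbone()
model.register("mtl", 1.0)  # multi-task-learning adapter
model.register("cl", 2.0)   # continual-learning adapter
print(model.forward(10.0, "mtl"))  # → 11.0
print(model.forward(10.0, "cl"))   # → 12.0
```

Because the adapters occupy identical positions in the layer, switching regimes is a matter of swapping small modules rather than retraining the backbone.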