2021
DOI: 10.1109/access.2021.3057880
A Mutually Auxiliary Multitask Model With Self-Distillation for Emotion-Cause Pair Extraction

Cited by 18 publications (1 citation statement)
References 48 publications
“…For Inter-EC [4], Shan and Zhu [31] designed a new cause-extraction component based on the Transformer [32] to improve this model. Yu et al. [33] applied the self-distillation method to train a mutually auxiliary multitask model. Jia et al. [34] realized mutual promotion of emotion extraction and cause extraction by recursively modeling clauses.…”

Section: 2.2.1 Pipelined ECPE
Confidence: 99%