Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing 2022
DOI: 10.18653/v1/2022.emnlp-main.212
COM-MRC: A COntext-Masked Machine Reading Comprehension Framework for Aspect Sentiment Triplet Extraction

Cited by 9 publications (1 citation statement) | References 0 publications
“…Their expanded versions DECNN (Xu et al., 2018; Chen and Qian, 2020) and BERT-PT (Xu et al., 2019; Wang et al., 2021) are generally adopted as backbones in subsequent studies. In addition, BERT is utilized as the pedestal for context-aware encoding in a series of more complex tasks, including Aspect-Sentiment Triplet Extraction (ASTE) (Chen et al., 2022a; Zhang et al., 2022b; Chen et al., 2022c; Zhang et al., 2022a; Chen et al., 2022d; Zhao et al., 2022b) and MRC-based ASTE (Yang and Zhao, 2022; Zhai et al., 2022). Recently, the generative framework has been introduced into studies of ASTE, and accordingly BART (Yan et al., 2021; Zhao et al., 2022a) and T5 (Hu et al., 2022) are used.…”
Section: Related Work
confidence: 99%