Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), 2019
DOI: 10.18653/v1/d19-1563

A Knowledge Regularized Hierarchical Approach for Emotion Cause Analysis

Abstract: Emotion cause analysis, which aims to identify the reasons behind emotions, is a key topic in sentiment analysis. A variety of neural network models have been proposed recently; however, these models mostly focus on the learning architecture with local textual information, ignoring discourse structure and prior knowledge, both of which play crucial roles in human text comprehension. In this paper, we propose a new method to extract emotion causes with a hierarchical neural model and knowledge-based regularizations, …
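
The abstract only names the ingredients, so the following is a minimal, hypothetical sketch of what a knowledge-regularized hierarchical clause model can look like in PyTorch. The layer sizes, the lexicon-count regularizer, and all identifiers are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' code): a hierarchical clause encoder whose
# training loss adds a lexicon-based "knowledge" penalty. Sizes, names and the
# toy regularizer are illustrative assumptions.
import torch
import torch.nn as nn

class HierarchicalCauseModel(nn.Module):
    def __init__(self, vocab_size, embed_dim=100, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # word-level encoder: builds one vector per clause from its words
        self.word_rnn = nn.GRU(embed_dim, hidden_dim,
                               batch_first=True, bidirectional=True)
        # clause-level encoder: lets clauses exchange discourse-level context
        self.clause_rnn = nn.GRU(2 * hidden_dim, hidden_dim,
                                 batch_first=True, bidirectional=True)
        self.classifier = nn.Linear(2 * hidden_dim, 2)  # cause / non-cause per clause

    def forward(self, docs):
        # docs: (batch, n_clauses, n_words) token ids
        b, c, w = docs.shape
        words = self.embed(docs.view(b * c, w))
        _, h = self.word_rnn(words)                        # h: (2, b*c, hidden)
        clause_vecs = torch.cat([h[0], h[1]], dim=-1).view(b, c, -1)
        ctx, _ = self.clause_rnn(clause_vecs)              # discourse-aware clause states
        return self.classifier(ctx)                        # (b, c, 2) logits

def knowledge_regularizer(logits, lexicon_hits, weight=0.1):
    # Hypothetical regularizer: clauses containing sentiment-lexicon words are
    # nudged toward higher cause probability. This is a stand-in for the
    # paper's knowledge-based constraints, not their exact formulation.
    cause_prob = torch.softmax(logits, dim=-1)[..., 1]
    mask = (lexicon_hits > 0).float()
    return weight * ((1.0 - cause_prob) * mask).mean()

# toy usage
model = HierarchicalCauseModel(vocab_size=5000)
docs = torch.randint(0, 5000, (2, 4, 12))            # 2 docs, 4 clauses, 12 words
lexicon_hits = torch.tensor([[1, 0, 2, 0], [0, 0, 1, 1]])
labels = torch.randint(0, 2, (2, 4))
logits = model(docs)
loss = nn.functional.cross_entropy(logits.view(-1, 2), labels.view(-1)) \
       + knowledge_regularizer(logits, lexicon_hits)
loss.backward()
```

The word-level GRU builds clause vectors, the clause-level GRU adds discourse context across clauses, and the extra loss term is one simple way to inject lexicon knowledge as a soft constraint on the classifier's predictions.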

Cited by 60 publications (31 citation statements)
References 27 publications
“…Xia et al [31] used the Transformer to encode clauses, which achieved excellent performance. In order to overcome the lack of training data, Fan et al [32] and Hu et al [33] coincidentally introduced external emotion knowledge on the basis of hierarchical design to improve the accuracy of the model. In addition, inspired by methods on other NLP tasks, researchers have made a lot of new attempts.…”
Section: A. ECE and ECPE (mentioning)
confidence: 99%
“…adopted Transformer encoder augmented with position information and integrated global prediction embedding to improve performance. Fan et al (2019) incorporated sentiment and position regularizers to restrain parameter learning. Hu et al (2019) exploited external sentiment classification corpus to pretrain the model.…”
Section: Related Work (mentioning)
confidence: 99%
“…Sentiment words with their polarity are widely used for sentiment analysis, including sentencelevel sentiment classification (Taboada et al, 2011;Shin et al, 2017;Lei et al, 2018;Barnes et al, 2019), aspect-level sentiment classification (Vo and Zhang, 2015), opinion extraction (Li and Lam, 2017), emotion analysis (Gui et al, 2017;Fan et al, 2019) and so on. Lexicon-based method (Turney, 2002;Taboada et al, 2011) directly utilizes polarity of sentiment words for classification.…”
Section: Related Work (mentioning)
confidence: 99%
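
As a concrete illustration of the lexicon-based method referenced in the statement above (Turney, 2002; Taboada et al., 2011), here is a toy sketch that sums the polarities of sentiment words with simple negation flipping. The tiny lexicon, the negator list, and the scoring rule are illustrative assumptions, not any cited system.

```python
# Toy lexicon-based polarity classifier: sum word polarities, flip on negators.
SENTIMENT_LEXICON = {"good": 1.0, "great": 1.0, "happy": 1.0,
                     "bad": -1.0, "terrible": -1.0, "sad": -1.0}
NEGATORS = {"not", "never", "no"}

def lexicon_polarity(sentence: str) -> str:
    score, flip = 0.0, 1.0
    for tok in sentence.lower().split():
        if tok in NEGATORS:
            flip = -1.0              # flip the polarity of the next sentiment word
            continue
        if tok in SENTIMENT_LEXICON:
            score += flip * SENTIMENT_LEXICON[tok]
            flip = 1.0
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(lexicon_polarity("the movie was not good"))    # -> negative
print(lexicon_polarity("a great and happy ending"))  # -> positive
```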
“…There are many specific sentiment tasks, and these tasks usually depend on different types of sentiment knowledge including sentiment words, word polarity and aspect-sentiment pairs. The importance of these knowledge has been verified by tasks at different level, for instance, sentence-level sentiment classification (Taboada et al, 2011;Shin et al, 2017;Lei et al, 2018), aspect-level sentiment classification (Vo and Zhang, 2015;Zeng et al, 2019), opinion extraction (Li and Lam, 2017;Gui et al, 2017;Fan et al, 2019) and so on. Therefore, we assume that, by integrating these knowledge into the pre-training process, the learned representation would be more sentimentspecific and appropriate for sentiment analysis.…”
Section: Introduction (mentioning)
confidence: 99%