Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2014
DOI: 10.3115/v1/d14-1080

Opinion Mining with Deep Recurrent Neural Networks

Abstract: Recurrent neural networks (RNNs) are connectionist models of sequential data that are naturally applicable to the analysis of natural language. Recently, "depth in space" (as an orthogonal notion to "depth in time") in RNNs has been investigated by stacking multiple layers of RNNs, and has been shown empirically to bring a temporal hierarchy to the architecture. In this work we apply these deep RNNs to the task of opinion expression extraction formulated as a token-level sequence-labeling task. Experimental results show…
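
The abstract describes stacked ("deep in space") recurrent layers applied to token-level sequence labeling. As a minimal sketch only, not the authors' released implementation, the following PyTorch code assembles a bidirectional stacked Elman-style RNN tagger; the embedding dimension, hidden size, depth, and label count are assumptions chosen for illustration.

```python
# Illustrative sketch only: a stacked ("deep in space") Elman-style RNN used as a
# token-level sequence labeler, in the spirit of the abstract above.
# Embedding dimension, hidden size, depth, and label count are assumptions.
import torch
import torch.nn as nn

class DeepRNNTagger(nn.Module):
    def __init__(self, emb_dim=300, hidden_dim=100, num_layers=3, num_labels=5):
        super().__init__()
        # Multiple stacked recurrent layers provide "depth in space";
        # the recurrence over the token sequence provides "depth in time".
        self.rnn = nn.RNN(emb_dim, hidden_dim, num_layers=num_layers,
                          nonlinearity="tanh", batch_first=True,
                          bidirectional=True)
        self.out = nn.Linear(2 * hidden_dim, num_labels)

    def forward(self, word_vectors):
        # word_vectors: (batch, seq_len, emb_dim) word embeddings,
        # e.g. pre-trained vectors looked up per token
        hidden_states, _ = self.rnn(word_vectors)   # (batch, seq_len, 2*hidden_dim)
        return self.out(hidden_states)              # per-token label scores

# Usage: score a batch of 2 sentences of length 10
tagger = DeepRNNTagger()
scores = tagger(torch.randn(2, 10, 300))            # shape (2, 10, 5)
```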

Cited by 338 publications (251 citation statements). References 15 publications.

“…On the contrary, İrsoy and Cardie implement an end-to-end sequence labeler for opinion extraction with SRNN [2]. It takes unsupervised trained word vectors as input and outperforms previous approaches.…”
Section: Opinion Expression Extraction (mentioning)
confidence: 99%
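
As a small illustration of the token-level formulation this statement refers to, the snippet below tags an invented sentence with a BIO-style scheme; both the sentence and the labels are hypothetical and serve only to show the data shape a sequence labeler consumes.

```python
# Invented example of the token-level labeling formulation mentioned above:
# each token is tagged as beginning (B), inside (I), or outside (O) an
# opinion expression. Sentence and tags are illustrative only.
tokens = ["The", "committee", ",", "as", "usual", ",",
          "has", "refused", "to", "make", "any", "statements", "."]
labels = ["O",   "O",         "O", "B",  "I",     "O",
          "O",   "B",         "I", "I",  "I",     "I",          "O"]
assert len(tokens) == len(labels)
```
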
“…Recent natural language processing (NLP) works leverage word embeddings to encode syntactic and semantic information. Composing such embeddings through a deep recurrent neural network has proven to be an efficient way to model the interactions between words. İrsoy and Cardie conducted comprehensive research on utilizing the Elman network, also known as a Simple Recurrent Neural Network (SRNN), for opinion expression extraction and achieved state-of-the-art performance [2].…”
Section: Introduction (mentioning)
confidence: 99%
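
For reference, here is a minimal sketch of the Elman (SRNN) recurrence this statement refers to, written in plain NumPy; the function name, dimensions, and random parameters are assumptions for the example, not the authors' code.

```python
# Minimal sketch of the Elman (simple recurrent) step:
# h_t = tanh(W_x x_t + W_h h_{t-1} + b). Dimensions are assumptions.
import numpy as np

def elman_forward(word_vectors, W_x, W_h, b):
    """Compose a sequence of word embeddings into hidden states."""
    h = np.zeros(W_h.shape[0])               # initial hidden state
    states = []
    for x_t in word_vectors:                  # one embedding per token
        h = np.tanh(W_x @ x_t + W_h @ h + b)  # recurrence over the token sequence
        states.append(h)
    return np.stack(states)                   # (seq_len, hidden_dim)

# Example with random parameters: 4 tokens, 50-d embeddings, 25-d hidden layer
rng = np.random.default_rng(0)
H = elman_forward(rng.standard_normal((4, 50)),
                  rng.standard_normal((25, 50)),
                  rng.standard_normal((25, 25)),
                  np.zeros(25))
```
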
“…Since 2014 the SemEval workshop has included a shared task on the topic (Pontiki et al., 2014), which has also encouraged the development of new supervised methods. We find approaches based on CRFs, such as Mitchell et al. (2013), as well as on deep learning (Irsoy and Cardie, 2014; Liu et al., 2015a; Zhang et al., 2015).…”
Section: Related Work (mentioning)
confidence: 99%
“…CNN and RNN models achieve high quality in sentiment analysis on short texts [44] and in opinion mining tasks [45].…”
Section: A. V. Melnikov, D. S. Botov, Yu. D. Klenin (mentioning)
confidence: 99%