Proceedings of the 12th International Workshop on Semantic Evaluation 2018
DOI: 10.18653/v1/s18-1037
NTUA-SLP at SemEval-2018 Task 1: Predicting Affective Content in Tweets with Deep Attentive RNNs and Transfer Learning

Abstract: In this paper we present the deep-learning models that we submitted to the SemEval-2018 Task 1 competition: "Affect in Tweets". We participated in all subtasks for English tweets. We propose a Bi-LSTM architecture equipped with a multi-layer self-attention mechanism. The attention mechanism improves model performance and allows us to identify salient words in tweets, as well as gain insight into the models, making them more interpretable. Our model utilizes a set of word2vec word embeddings trained on a large coll…
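The self-attention pooling the abstract describes can be sketched in a few lines: each Bi-LSTM hidden state gets a salience score, the scores are softmax-normalized, and the tweet representation is the weighted sum of hidden states. This is a minimal numpy sketch under assumed shapes and parameter names (`W`, `v`), not the authors' actual implementation; the hidden states are random stand-ins for Bi-LSTM outputs.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(H, W, v):
    """H: (T, d) per-token hidden states -> (attention weights, pooled vector)."""
    scores = np.tanh(H @ W) @ v      # (T,) unnormalized salience per token
    weights = softmax(scores)        # (T,) attention distribution over tokens
    return weights, weights @ H      # (d,) weighted-sum tweet representation

rng = np.random.default_rng(0)
T, d, a = 12, 8, 4                   # 12 tokens, hidden size 8, attention size 4
H = rng.normal(size=(T, d))          # stand-in for Bi-LSTM outputs
W, v = rng.normal(size=(d, a)), rng.normal(size=a)
w, r = attention_pool(H, W, v)
print(w.sum(), r.shape)              # weights sum to 1; r has shape (8,)
```

The attention weights `w` are what make the model interpretable: tokens with large weights are the "salient words" the abstract refers to.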

Cited by 90 publications (50 citation statements)
References 39 publications
“…According to recent research by Young et al. [4], deep learning methods currently achieve state-of-the-art results in many domains. This trend was confirmed by the number of entries in the recent SemEval-2018 competition [5], where the winning entries in the English and Spanish [6,7] multi-label classification competitions all made use of neural networks. Indeed, many of the top-performing entries made use of a special kind of neural network known as "long short-term memory", either in conjunction with attention mechanisms [8,9], word embeddings [10,11] or convolutional neural networks [12,13].…”
Section: Introduction
confidence: 74%
“…Emotion classification tasks have traditionally been tackled by obtaining a large corpus, constructing feature sets, pre-processing in sophisticated ways and then making use of any number of black box training algorithms [14,15]. Indeed, this approach is still prevalent currently [6,7,16]. We demonstrate that with a small-sized corpus and without using any black box algorithms, our results are comparable with systems that have been trained on millions (and sometimes billions) of tweets.…”
Section: Introduction
confidence: 75%
“…Baziotis et al. [33], the winners of the multi-label emotion classification task of SemEval-2018 Task 1: Affect in Tweets, developed a bidirectional Long Short-Term Memory (LSTM) network with a deep attention mechanism. They trained a word2vec model with 800,000 words derived from a dataset of 550 million tweets.…”
Section: Emotion Classification in Tweets
confidence: 99%
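The word2vec component referenced above amounts to a lookup table: each of the 800,000 vocabulary words maps to a dense vector, and unseen tokens fall back to a shared unknown-word vector. A toy sketch, using a two-word vocabulary and 4-dimensional vectors as stand-ins for the real 550-million-tweet model:

```python
import numpy as np

dim = 4
# hypothetical pretrained vectors; real word2vec entries would be learned
vocab = {"happy": np.ones(dim), "sad": -np.ones(dim)}
unk = np.zeros(dim)  # shared vector for out-of-vocabulary tokens

def embed(tokens):
    # map each token to its pretrained vector, falling back to UNK
    return np.stack([vocab.get(t, unk) for t in tokens])

X = embed(["happy", "sad", "xyzzy"])
print(X.shape)  # (3, 4); the last row is the UNK vector
```

The resulting `(tokens, dim)` matrix is what a downstream Bi-LSTM would consume as its input sequence.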
“…• NTUA-SLP: the system submitted by the winning team of the SemEval-2018 Task 1 E-c challenge [33].
• TCS: the system submitted by the second-place winner [34].
• PlusEmo2Vec: the system submitted by the third-place winner [35].
• Transformer: a deep learning system based on large pre-trained language models, developed by the NVIDIA AI lab [39].…”
confidence: 99%