2019 IEEE First International Conference on Cognitive Machine Intelligence (CogMI) 2019
DOI: 10.1109/cogmi48466.2019.00025
A2Text-Net: A Novel Deep Neural Network for Sarcasm Detection

Cited by 37 publications (19 citation statements) | References 31 publications
“…Chatterjee et al. [50] took the context of the utterance into consideration and proposed a deep-learning-based approach. Liu et al. [51] proposed a deep neural network, called A2Text-Net, that mimics face-to-face speech by integrating auxiliary clues such as punctuation, part of speech (POS), and emoji to improve the performance of sarcasm detection.…”
Section: A. Sarcasm Analysis
confidence: 99%
“…To create a dataset, they selected sarcastic and non-sarcastic videos. [33] proposed a new deep neural network method for sarcasm detection known as A2Text-Net. They integrated auxiliary variables such as POS, emoji, punctuation, and numerals to enhance the performance of the sarcasm detection classifier.…”
Section: Literature Review
confidence: 99%
“…With the advent of deep learning, recent works [5,6,7,8,9] leverage neural networks to learn both lexical and contextual features, eliminating the need for hand-crafted features. In these works, word embeddings are incorporated to train deep convolutional, recurrent, or attention-based neural networks to achieve state-of-the-art results on multiple large-scale datasets.…”
Section: Introduction
confidence: 99%
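The convolutional route mentioned in that statement amounts to sliding a kernel over a sequence of word-embedding vectors and max-pooling the responses. The sketch below, with made-up toy dimensions and no training, only illustrates that core operation, not any cited model.

```python
def conv1d_maxpool(embeds, kernel):
    """Valid 1-D convolution of a kernel over a sequence of
    embedding vectors, followed by max-pooling over positions.
    Toy sketch of the CNN-over-embeddings idea."""
    k = len(kernel)
    responses = []
    for i in range(len(embeds) - k + 1):
        # dot product of the kernel with a window of k embeddings
        responses.append(sum(
            a * b
            for j in range(k)
            for a, b in zip(embeds[i + j], kernel[j])
        ))
    return max(responses)  # max-pool keeps the strongest match
```

In a full model, many such kernels run in parallel and their pooled outputs feed a classifier; here a single kernel suffices to show the mechanics.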