2021
DOI: 10.1007/s10489-021-02580-3

An attention network via pronunciation, lexicon and syntax for humor recognition

Cited by 8 publications (3 citation statements)
References 32 publications
“…Zhang [13] interactively modeled the semantic information of setups and punchlines in humorous texts at both word and clause levels to predict the humor level. Ren [14] used CNN, LSTM [15] and Attention to encode the text from pronunciation, lexicon and syntax perspectives to recognize humor.…”
Section: Related Work
confidence: 99%
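The statement above describes an encoder that combines pronunciation, lexicon, and syntax views of a text via attention. As a rough illustration of that idea (a hypothetical sketch, not the paper's actual architecture), the snippet below applies attention pooling to three per-token feature views and concatenates the results into one sentence vector:

```python
# Hypothetical sketch: additive-style attention pooling over three
# per-token feature "views" (pronunciation, lexicon, syntax), fused
# into a single sentence representation. Feature matrices here are
# random stand-ins for real encoder outputs (e.g. CNN/LSTM features).
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_pool(H, w):
    """Score each token vector in H (T, d) against a query w (d,),
    then return the attention-weighted average of the tokens."""
    weights = softmax(H @ w)          # (T,) weights, sum to 1
    return weights @ H                # (d,) pooled sentence vector

T, d = 7, 16                          # toy sequence length, feature size
views = {name: rng.standard_normal((T, d))
         for name in ("pronunciation", "lexicon", "syntax")}
w = rng.standard_normal(d)            # shared attention query (illustrative)

pooled = [attention_pool(H, w) for H in views.values()]
sentence_vec = np.concatenate(pooled) # (3*d,) fused representation
print(sentence_vec.shape)             # (48,)
```

A real model would learn `w` (and typically use a separate query per view) and feed `sentence_vec` to a classifier; the names and shapes here are assumptions for illustration only.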
“…In recent years, Deep Neural Networks (DNNs) have been employed. To give an example, Chen and Soo [28] build a Convolutional Neural Network (CNN), while Ren et al [35] propose an approach based on Long Short-Term Memories (LSTMs) and an attention mechanism. With the advent of large pretrained transformer language models like BERT [36], the focus of textual humour detection shifted towards such models, motivated by their promising performance in many NLP downstream tasks.…”
Section: Textual Humour Recognition
confidence: 99%
“…In their study, Ren et al (2022) addressed joke recognition by focusing on linguistic elements. This study drew attention to the aspects of pronunciation, lexicon, and syntax for joke recognition.…”
Section: Introduction
confidence: 99%