2019
DOI: 10.1371/journal.pone.0222713
Self Multi-Head Attention-based Convolutional Neural Networks for fake news detection

Abstract: With the rapid development of the internet, social media has become an essential tool for obtaining information and has attracted a large number of people to join its platforms because of their low cost, accessibility, and engaging content. It greatly enriches our lives. However, its rapid development and widespread adoption have also provided an excellent vehicle for the spread of fake news; people are constantly exposed to fake news and suffer from it. Fake news usually uses hyperbole to catch people'…

Cited by 46 publications (15 citation statements)
References 20 publications
“…", "Why is mankind afraid of death?"). It might originate in a frequent usage of rhetorical questions to emphasize the ideas consciously and intensify the sentiment [11]. Another interesting observation is found in users' depiction of suicidal tendencies.…”
Section: Data Analysis Results (mentioning)
confidence: 99%
“…As the next step, the pooled feature maps are flattened through a reshape function so that the pooled feature vectors can be concatenated: Flattening = pooled.reshape (Eq. 11). The equation takes the rows and appends them all to create a single column vector.…”
Section: Convolutional Layer (mentioning)
confidence: 99%
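The flattening step described in this excerpt can be sketched with NumPy's `reshape`; the shapes below are illustrative assumptions, not values from the cited paper:

```python
import numpy as np

# Hypothetical pooled feature maps from a convolutional layer:
# 3 maps of size 2x2 (shapes chosen for illustration only).
pooled = np.array([
    [[1, 2], [3, 4]],
    [[5, 6], [7, 8]],
    [[9, 10], [11, 12]],
])

# Flattening: reshape appends the rows of every map end to end,
# producing a single column vector that can be concatenated with
# other feature vectors and fed to a dense layer.
flattened = pooled.reshape(-1, 1)

print(flattened.shape)  # (12, 1)
```

Passing `-1` lets NumPy infer the row count from the total number of elements, so the same line works regardless of how many feature maps the pooling layer produced.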
“…Another study [9] used a self multi-head attention-based CNN (SMHACNN). The study implemented CNN and self multi-head attention (SMHA) techniques and evaluated the truthfulness of news based on its content.…”
Section: Related Studies (mentioning)
confidence: 99%
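For readers unfamiliar with the SMHA component named in this excerpt, here is a minimal self multi-head attention sketch. All shapes, names, and the random projection weights are assumptions for illustration; this is not the paper's SMHACNN implementation:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_multi_head_attention(x, num_heads, rng):
    """Scaled dot-product self-attention split across several heads."""
    seq_len, d_model = x.shape
    d_head = d_model // num_heads
    # Random projections stand in for learned Q/K/V weight matrices.
    Wq, Wk, Wv = (rng.random((d_model, d_model)) for _ in range(3))
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    # Split into heads: (num_heads, seq_len, d_head).
    split = lambda m: m.reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    q, k, v = split(q), split(k), split(v)
    # Attention per head, scaled by sqrt(d_head).
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)
    weights = softmax(scores)            # (num_heads, seq_len, seq_len)
    out = weights @ v                    # (num_heads, seq_len, d_head)
    # Concatenate the heads back to (seq_len, d_model).
    return out.transpose(1, 0, 2).reshape(seq_len, d_model), weights

rng = np.random.default_rng(0)
x = rng.random((6, 8))                   # 6 tokens, embedding size 8
out, weights = self_multi_head_attention(x, num_heads=2, rng=rng)
print(out.shape, weights.shape)          # (6, 8) (2, 6, 6)
```

In an SMHA-CNN style pipeline, an output like `out` would then be passed to convolutional layers that extract local n-gram features from the attention-reweighted token representations.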
“…Self-attention weights from such transformer models are usually structured in a 3-dimensional matrix. Using convolutional neural networks with these self-attention weight matrices can be helpful for extracting semantic features for downstream NLP tasks (Fang et al, 2019). Similar approaches can be used to extract features for detecting counterfactuals in "Thanks for the article on this new term that fits me so well, wish all your articles were worthy of praise."…”
Section: Introduction (mentioning)
confidence: 99%
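The idea in this excerpt of convolving over a 3-dimensional self-attention weight matrix can be sketched as follows. The shapes, the single filter, and the random inputs are assumptions; real systems would use many learned filters in a framework such as PyTorch:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical transformer self-attention weights:
# (num_heads, seq_len, seq_len), one attention map per head.
num_heads, seq_len = 4, 8
attn = rng.random((num_heads, seq_len, seq_len))

# Treat the heads as input channels and slide one 2-D filter over
# the attention maps (a common way to feed attention weights to a CNN).
k = 3
kernel = rng.random((num_heads, k, k))

# Valid cross-correlation (what deep-learning "convolution" layers compute).
out = np.zeros((seq_len - k + 1, seq_len - k + 1))
for i in range(out.shape[0]):
    for j in range(out.shape[1]):
        out[i, j] = np.sum(attn[:, i:i + k, j:j + k] * kernel)

print(out.shape)  # (6, 6)
```

Each output cell summarizes a local patch of token-to-token attention across all heads, which is the kind of semantic feature map a downstream classifier (e.g. for counterfactual detection) could consume.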