2023
DOI: 10.36227/techrxiv.22215403
Preprint

Fake News Detection by Fine Tuning of Bidirectional Encoder Representations from Transformers

Abstract: Everyone now has internet and social media access, making it simple to obtain information and news. On the other hand, fake news articles also circulate. They not only make it difficult for the public to judge truthfulness but also mislead them. Consequently, developing intelligent systems for separating fake news from genuine news is critical. In this paper, four deep learning techniques, namely Bidirectional Encoder Representations from Transformers (BERT), Long Short-Term Memory (LSTM), Bidirectional Long Short-Term Memo…

Cited by 2 publications (1 citation statement)
References 22 publications
“…The results show that the GPT embeddings with machine learning models performed better than fine-tuning of BERT (Bidirectional Encoder Representations from Transformers). In paper [20] it is reported that BERT fine-tuning on the GossipCop dataset produced an accuracy of around 85%. For the same dataset, the proposed method performed better, with around 87% accuracy.…”
Section: B. Experiments II: Sentiment Analysis (mentioning)
Confidence: 99%