2023
DOI: 10.21203/rs.3.rs-3156168/v1
Preprint

Fake News Detection Using a Logistic Regression Model and Natural Language Processing Techniques

Johnson Adeleke Adeyiga,
Philip Gbounmi Toriola,
Temitope Elizabeth Abioye (Ogunbiyi)
et al.

Abstract: The proliferation of fake news has become a significant challenge in recent years, affecting democracy, the journalism industry, and people's daily lives. The spread of intentionally misleading or fabricated information has eroded confidence in government institutions and carries profound social consequences. This study aims to detect fake and real news using a logistic regression algorithm and natural language processing techniques, to implement the model in Python, and …
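The visible portion of the abstract names the core pipeline: text preprocessed with NLP techniques, then classified by logistic regression in Python. A minimal sketch of such a pipeline with scikit-learn follows; the toy corpus, the TF-IDF settings, and the train/test split are illustrative assumptions, not the authors' reported setup.

```python
# Minimal sketch of a TF-IDF + logistic regression fake-news classifier.
# The in-line corpus and all hyperparameters are illustrative assumptions,
# not the configuration reported in the paper.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Hypothetical toy corpus: 1 = fake, 0 = real.
texts = [
    "Miracle cure discovered, doctors hate this trick",
    "Central bank raises interest rates by 25 basis points",
    "Aliens endorse presidential candidate, sources say",
    "City council approves new public transport budget",
]
labels = [1, 0, 1, 0]

# Convert raw text to TF-IDF features (unigrams and bigrams here).
vectorizer = TfidfVectorizer(stop_words="english", ngram_range=(1, 2))
X = vectorizer.fit_transform(texts)

X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.5, random_state=42
)

# Fit the logistic regression classifier and score the held-out split.
clf = LogisticRegression(max_iter=1000)
clf.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

In a setup like this, most of the predictive signal comes from the TF-IDF features, so swapping the classifier for any of the alternatives discussed in the citation statement below requires changing only the final estimator.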

Cited by 1 publication (1 citation statement)
References 16 publications
“…Figures 3 and 4 summarize the settings and the training steps of the language models. Machine Learning classifiers deployed, trained, and tested in this work were Logistic regression [30], Support Vector Machine (SVM) [31], K-nearest neighbors (KNN) [32], Decision Tree [33], Stochastic Gradient Descent (SGD) [34], and Multinomial Naive Bayes [35]. In the ensemble learning category, several models were applied to do the same task such as Voting Classifiers [36], Random Forest [37], Bagging Meta-Estimator [38], AdaBoost [39], XGBoost [40], Gradient Boosting [41], and Light Gradient Boosting Machine (LightGBM) [42].…”
Section: The Proposed Approach
confidence: 99%
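The citing paper benchmarks individual classifiers against ensemble methods on the same task. A condensed sketch of such a comparison loop with scikit-learn is given below; the model settings, the cross-validation protocol, and the omission of the XGBoost/LightGBM entries (to keep the sketch dependent on scikit-learn alone) are assumptions, and X and y stand for features and labels prepared elsewhere.

```python
# Sketch of comparing baseline and ensemble classifiers on one task,
# as described in the citing paper. Models and settings are assumptions;
# the XGBoost and LightGBM entries are omitted to stay scikit-learn-only.
from sklearn.linear_model import LogisticRegression, SGDClassifier
from sklearn.svm import LinearSVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import MultinomialNB
from sklearn.ensemble import (
    RandomForestClassifier, BaggingClassifier,
    AdaBoostClassifier, GradientBoostingClassifier,
)
from sklearn.model_selection import cross_val_score

models = {
    "logreg": LogisticRegression(max_iter=1000),
    "svm": LinearSVC(),
    "knn": KNeighborsClassifier(),
    "tree": DecisionTreeClassifier(),
    "sgd": SGDClassifier(),
    "mnb": MultinomialNB(),
    "forest": RandomForestClassifier(),
    "bagging": BaggingClassifier(),
    "adaboost": AdaBoostClassifier(),
    "gboost": GradientBoostingClassifier(),
}

def compare(X, y, cv=5):
    # X, y: TF-IDF features and labels prepared elsewhere (placeholders).
    for name, model in models.items():
        scores = cross_val_score(model, X, y, cv=cv)
        print(f"{name}: {scores.mean():.3f}")
```

Running every model through the same cross-validation call keeps the comparison fair: each classifier sees identical folds, so differences in mean score reflect the models rather than the data split.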