2020 2nd International Conference on Electrical, Control and Instrumentation Engineering (ICECIE) 2020
DOI: 10.1109/icecie50279.2020.9309565

A Review of Light Gradient Boosting Machine Method for Hate Speech Classification on Twitter

Cited by 15 publications (7 citation statements)
References 16 publications
“…LightGBM, short for Light Gradient Boosting Machine, is a gradient-boosting framework that uses tree-based learning algorithms. Proposed by [24] to address the shortcomings of conventional Gradient Boosting Decision Tree (GBDT) implementations, it is based on two novel techniques, Gradient-Based One-Side Sampling (GOSS) and Exclusive Feature Bundling (EFB) [49]. The experiments in [24] demonstrated that LightGBM can accelerate the training process by over 20 times in some cases.…”
Section: Overview Of Machine Learning Models
confidence: 99%
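As a rough illustration of the framework these citing works describe, the sketch below trains a LightGBM model through its native Python API on synthetic data; GOSS and EFB operate inside the library's histogram-based tree learner rather than as explicit user-facing steps. The data, parameter values, and variable names are illustrative assumptions, not taken from the cited experiments.

```python
# Illustrative sketch only: LightGBM's native training API on synthetic data.
# GOSS (gradient-based sampling) and EFB (feature bundling) are applied
# internally by the library and do not appear as explicit steps here.
import numpy as np
import lightgbm as lgb

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))                                   # placeholder features
y = (X[:, 0] + rng.normal(scale=0.5, size=1000) > 0).astype(int)  # placeholder binary labels

train_set = lgb.Dataset(X, label=y)
params = {"objective": "binary", "num_leaves": 31, "learning_rate": 0.1}
booster = lgb.train(params, train_set, num_boost_round=100)

print(booster.predict(X[:5]))  # predicted probabilities for a few rows
```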
“…The LightGBM hate-speech classification experiment used one thousand data samples, with 70% of the dataset serving as training data and 30% as evaluation data; an accuracy of 86.05% was achieved [12]. Aishwarya Mujumdar and Dr. Vaidehi V. (2019) created a pipelined model for diabetes classification and prediction.…”
Section: Related Work
confidence: 99%
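Below is a minimal sketch of the 70/30 split-and-evaluate protocol summarized above, assuming TF-IDF features and LightGBM's scikit-learn wrapper. The placeholder corpus, labels, and the min_child_samples setting are hypothetical stand-ins for the roughly one thousand labelled tweets used in the cited work.

```python
# Hypothetical end-to-end sketch: TF-IDF features, 70/30 split, LightGBM classifier.
# The tiny placeholder corpus stands in for the ~1,000 labelled tweets cited above.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from lightgbm import LGBMClassifier

texts = ["placeholder tweet"] * 10 + ["another placeholder tweet"] * 10  # stand-in corpus
labels = [0] * 10 + [1] * 10                                             # assumed: 0 = non-hate, 1 = hate

X = TfidfVectorizer().fit_transform(texts)
X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.3, stratify=labels, random_state=42
)

# min_child_samples is lowered only because the placeholder corpus is tiny.
clf = LGBMClassifier(min_child_samples=2)
clf.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```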
“…Whereas other methods grow trees level by level (horizontally), LGBM grows trees leaf by leaf (vertically): at each step, the leaf with the greatest delta loss is chosen to be split [32]. This study used the default parameters for the LGBM classifier.…”
Section: LightGBM Classifier
confidence: 99%
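For reference, a brief sketch of the default configuration the quoted study relied on, with the parameters most relevant to leaf-wise growth written out explicitly. The values match LightGBM's documented defaults, but the snippet itself is only illustrative.

```python
# LightGBM's leaf-wise ("best-first") growth is governed mainly by num_leaves;
# level-wise libraries are constrained by max_depth instead.
# The values below are LightGBM's documented defaults, written out for clarity.
from lightgbm import LGBMClassifier

clf = LGBMClassifier(
    num_leaves=31,    # cap on leaves per tree; the leaf with the largest loss reduction is split next
    max_depth=-1,     # no depth limit by default; depth is controlled indirectly via num_leaves
    learning_rate=0.1,
    n_estimators=100,
)
print(clf.get_params()["num_leaves"])
```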