2023
DOI: 10.1016/j.procs.2022.12.188
BERT base model for toxic comment analysis on Indonesian social media

Cited by 17 publications (7 citation statements)
References 7 publications
"…As described above regarding the analysis of semantic change on social media, particularly Instagram and X. Social media platforms in Indonesia, through the presence of technology, encourage the phenomenon of semantic change by creating freedom to express ideas, criticism, suggestions, and so on in social media comments (Nabiilah et al., 2023).…"
Section: Perubahan Makna Menjadi Lebih Buruk [Semantic Change for the Worse] (unclassified)
"…et al., 2023). Sometimes these comments spark minor debates among netizens, because the comments that appear often carry new meanings that are ambiguous or not yet widely known (Nabiilah et al., 2023).…"
Section: Perubahan Makna Menjadi Lebih Buruk [Semantic Change for the Worse] (unclassified)
“…Research category classification of scientific articles on human health risks of electromagnetic fields using pre-trained BERT [267]. BERT base model for toxic comment analysis on Indonesian social media [277]. Comparison of BERT implementations for natural language processing of narrative medical documents [296].…”
Section: Fig. 5, AI Components for NLP (mentioning)
confidence: 99%
“…These models hold great promise for enhancing text-based interactions and insights. [245], [180], [188], [251], [261], [264], [271], [272], [274], [276], [277], [278], [279], [218], [288], [296], [297], [236], [299], [238] GPT Variants…”
Section: Fig. 5, AI Components for NLP (mentioning)
confidence: 99%
"…The optimal result of this study uses BERT for word embedding and classification, with an F1 score of 84% on multi-class classification. Another study, by Nabiilah et al. [10], also used a pre-trained model with the BERT architecture. That study compared several pre-trained models trained on Indonesian-language corpus data, namely multilingual BERT (MBERT), Indonesian BERT (IndoBERT), and the Indonesian robustly optimized BERT pretraining approach (IndoRoBERTa) small.…"
Section: Introduction (mentioning)
confidence: 99%