2023
DOI: 10.3390/su151410828
Transformer Architecture-Based Transfer Learning for Politeness Prediction in Conversation

Abstract: Politeness is an essential part of a conversation. As in verbal communication, conveying politeness in textual conversation and social media posts is also challenging. Therefore, the automatic detection of politeness is a significant and relevant problem. The existing literature generally employs classical machine learning models such as naive Bayes and Support Vector Machine-based classifiers for politeness prediction. This paper exploits the state-of-the-art (SOTA) transformer architecture and transfer learning for respe…
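The approach outlined in the abstract, fine-tuning a pretrained transformer for politeness classification via transfer learning, can be illustrated with a minimal sketch. This assumes the Hugging Face Transformers library, a generic "bert-base-uncased" checkpoint, binary polite/impolite labels, and two hypothetical example utterances; the paper's actual models, corpus, and hyperparameters are not shown on this page.

import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_NAME = "bert-base-uncased"  # placeholder checkpoint; any pretrained encoder works

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(
    MODEL_NAME, num_labels=2  # assumed binary task: polite vs. impolite
)

# Hypothetical utterances with illustrative politeness labels (1 = polite).
texts = [
    "Could you please review my patch when you get a chance?",
    "Fix this now.",
]
labels = torch.tensor([1, 0])

# Tokenize and run a single fine-tuning step; a real setup would iterate
# over a labeled politeness corpus for several epochs.
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

model.train()
out = model(**batch, labels=labels)
out.loss.backward()
optimizer.step()
optimizer.zero_grad()

# Inference: predicted class index per utterance.
model.eval()
with torch.no_grad():
    preds = model(**batch).logits.argmax(dim=-1)
print(preds.tolist())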

Cited by 6 publications (1 citation statement)
References 25 publications
“…By evaluating the performance metrics and accuracy rates of each model, we can gain a better understanding of the advancements and limitations in the field of obesity prediction using machine learning techniques. Moreover, the proposed model can be efficiently used for other applications of deep learning and machine learning other than obesity [30][31][32][33][34][35][36][37][38].…”
Section: Comparative Analysis of Results
Mentioning confidence: 99%