2023
DOI: 10.32604/cmc.2023.037112
Multi-Task Learning Model with Data Augmentation for Arabic Aspect-Based Sentiment Analysis

Abstract: Aspect-based sentiment analysis (ABSA) is a fine-grained process. Its fundamental subtasks are aspect term extraction (ATE) and aspect polarity classification (APC), and these subtasks are dependent and closely related. However, most existing works on Arabic ABSA address them separately, assume that aspect terms are preidentified, or use a pipeline model. Pipeline solutions design different models for each task, and the output of the ATE model is used as the input to the APC model, which may result i…
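As a rough illustration of the joint setup the abstract contrasts with a pipeline (one shared encoder feeding both subtasks), the PyTorch sketch below wires an AraBERT-style encoder to a token-level ATE head and a sentence-level APC head. The checkpoint name, head sizes, and the Arabic example review are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch of a joint ATE + APC model with a shared encoder.
# Checkpoint name and head sizes are assumptions, not the authors' configuration.
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

ENCODER_NAME = "aubmindlab/bert-base-arabertv02"  # assumed AraBERT checkpoint

class JointATEAPC(nn.Module):
    def __init__(self, num_bio_tags: int = 3, num_polarities: int = 3):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(ENCODER_NAME)  # shared layer
        hidden = self.encoder.config.hidden_size
        self.ate_head = nn.Linear(hidden, num_bio_tags)      # token-level BIO tagging
        self.apc_head = nn.Linear(hidden, num_polarities)    # sentence-level polarity

    def forward(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        token_states = out.last_hidden_state              # (batch, seq_len, hidden)
        ate_logits = self.ate_head(token_states)           # per-token aspect tags
        apc_logits = self.apc_head(token_states[:, 0])     # polarity from [CLS]
        return ate_logits, apc_logits

tokenizer = AutoTokenizer.from_pretrained(ENCODER_NAME)
model = JointATEAPC()
# Example hotel-style review: "The room is clean but the service is slow."
batch = tokenizer(["الغرفة نظيفة لكن الخدمة بطيئة"], return_tensors="pt", padding=True)
ate_logits, apc_logits = model(batch["input_ids"], batch["attention_mask"])
```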

Cited by 3 publications (1 citation statement)
References: 71 publications
“…The experimental results demonstrated that the model exceeded the baseline model, which relied on conditional random fields (CRF) with features extracted using named entity recognition (NER), POS tagging, parsing, and semantic analysis, as well as other recently proposed models such as AraBERT, MarBERT, and CamelBERT-MSA. [43] proposed a multi-task learning approach called local context focus-aspect term extraction and polarity classification (LCF-ATEPC), with AraBERT as a shared layer for Arabic contextual text representation, to accomplish T1 and T2 simultaneously. The reference hotel and product review datasets were used.…”
Section: Deep Learning Approaches
Mentioning confidence: 99%
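For context on what accomplishing T1 and T2 "simultaneously" typically involves in such multi-task setups, the snippet below sketches one common way to combine a token-level ATE loss with a sentence-level APC loss into a single training objective. The weighting scheme and label conventions are assumptions for illustration, not values reported in the cited work.

```python
# Illustrative joint objective for T1 (aspect term extraction, token-level BIO tags)
# and T2 (aspect polarity classification, sentence-level). The equal 0.5 weighting
# is an assumed default, not a setting reported in the cited work.
import torch.nn as nn

ate_criterion = nn.CrossEntropyLoss(ignore_index=-100)  # ignore padding/sub-word tokens
apc_criterion = nn.CrossEntropyLoss()

def joint_loss(ate_logits, apc_logits, bio_labels, polarity_labels, alpha=0.5):
    """Weighted sum of the two subtask losses, optimized in one backward pass."""
    # ate_logits: (batch, seq_len, num_tags); bio_labels: (batch, seq_len)
    t1 = ate_criterion(ate_logits.reshape(-1, ate_logits.size(-1)), bio_labels.reshape(-1))
    # apc_logits: (batch, num_polarities); polarity_labels: (batch,)
    t2 = apc_criterion(apc_logits, polarity_labels)
    return alpha * t1 + (1.0 - alpha) * t2
```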