2023
DOI: 10.3233/jifs-221214
Combining BERT with TCN-BiGRU for enhancing Arabic aspect category detection

Abstract: Aspect-based sentiment analysis (ABSA) is a challenging sentiment analysis task that aims to extract the discussed aspects and to identify the sentiment expressed toward each aspect. Three main ABSA tasks can be distinguished: aspect term extraction, aspect category detection (ACD), and aspect sentiment classification. Most Arabic ABSA research has relied on rule-based or machine learning-based methods, with little attention to deep learning techniques. Moreover, most existing Arabic deep learning models a…
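As a rough illustration of the hybrid architecture named in the title, the following is a minimal PyTorch sketch of BERT embeddings feeding a temporal convolutional layer and a bidirectional GRU for aspect category classification. The AraBERT checkpoint name, the layer sizes, and the single dilated convolution standing in for a full TCN stack are assumptions for illustration, not the authors' exact configuration.

```python
# Minimal sketch of a BERT + TCN-BiGRU classifier for aspect category
# detection. Checkpoint name and hyperparameters are illustrative only.
import torch
import torch.nn as nn
from transformers import AutoModel

class BertTcnBiGru(nn.Module):
    def __init__(self, num_categories,
                 bert_name="aubmindlab/bert-base-arabertv2",  # assumed checkpoint
                 tcn_channels=128, gru_hidden=128):
        super().__init__()
        self.bert = AutoModel.from_pretrained(bert_name)
        hidden = self.bert.config.hidden_size
        # One dilated 1-D convolution as a stand-in for a TCN block;
        # padding keeps the sequence length unchanged.
        self.tcn = nn.Conv1d(hidden, tcn_channels, kernel_size=3,
                             padding=2, dilation=2)
        self.bigru = nn.GRU(tcn_channels, gru_hidden, batch_first=True,
                            bidirectional=True)
        self.classifier = nn.Linear(2 * gru_hidden, num_categories)

    def forward(self, input_ids, attention_mask):
        tokens = self.bert(input_ids=input_ids,
                           attention_mask=attention_mask).last_hidden_state
        x = self.tcn(tokens.transpose(1, 2)).transpose(1, 2)  # (B, T, C)
        x = torch.relu(x)
        _, h = self.bigru(x)                  # h: (2, B, gru_hidden)
        h = torch.cat([h[0], h[1]], dim=-1)   # concat forward/backward states
        return self.classifier(h)             # logits, one per aspect category
```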

Cited by 9 publications (4 citation statements) · References 11 publications

Citation statements (ordered by relevance):
“…The second task allows the model to predict whether a sentence is the next sentence in a given sequence of sentences. The BERT model has improved the results of many NLP tasks, including named entity recognition [33], [34], text classification [35], [36], and sentiment analysis [37], [38]. Figure 2 illustrates the architecture of the BERT model.…”
Section: Bidirectional Encoder Representations from Transformers Model
confidence: 99%
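Since this statement summarizes BERT's next-sentence-prediction pre-training objective, a small hedged example may help. It uses the Hugging Face transformers API; the bert-base-uncased checkpoint is an illustrative choice, not one taken from the cited work.

```python
# Probing BERT's next-sentence-prediction head via Hugging Face transformers.
import torch
from transformers import BertTokenizer, BertForNextSentencePrediction

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForNextSentencePrediction.from_pretrained("bert-base-uncased")

first = "The model reads the whole sentence at once."
second = "This bidirectionality comes from the transformer encoder stack."
inputs = tokenizer(first, second, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, 2): index 0 = "is next sentence"
probs = torch.softmax(logits, dim=-1)
print(f"P(second follows first) = {probs[0, 0]:.3f}")
```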
“…Logging data at depths ranging from 1621 to 1710 m are used as the training data to build the models, including the back propagation neural network (BPNN) [20], GRU [21], BiGRU-AM [22], and ISCSO-TCN-BiGRU-AM. The training models take six parameters as inputs: DEPTH, CAL, NPHI, GR, DT, and RT.…”
Section: Model Parameter Setting
confidence: 99%
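To make the BiGRU-AM idea in this statement concrete, here is a minimal PyTorch sketch of a bidirectional GRU with additive attention pooling over the six logging curves. The layer sizes and attention form are illustrative assumptions, not the cited paper's exact model, and the ISCSO hyperparameter optimization step is omitted.

```python
# Minimal sketch of a BiGRU with attention mechanism (BiGRU-AM) for
# regression on well-logging sequences; sizes are illustrative.
import torch
import torch.nn as nn

class BiGruAm(nn.Module):
    def __init__(self, n_features=6, hidden=64):
        super().__init__()
        self.bigru = nn.GRU(n_features, hidden, batch_first=True,
                            bidirectional=True)
        self.attn = nn.Linear(2 * hidden, 1)  # additive attention scores
        self.out = nn.Linear(2 * hidden, 1)   # predicted target curve value

    def forward(self, x):
        # x: (B, T, 6) with the six inputs DEPTH, CAL, NPHI, GR, DT, RT
        h, _ = self.bigru(x)                    # (B, T, 2*hidden)
        w = torch.softmax(self.attn(h), dim=1)  # (B, T, 1) weights over depth
        ctx = (w * h).sum(dim=1)                # attention-weighted summary
        return self.out(ctx).squeeze(-1)        # (B,) regression output
```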
“…Other transformer models, such as the Generative Pre-trained Transformer (GPT) and Efficiently Learning an Encoder that Classifies Token Replacements Accurately (ELECTRA), are under-explored in this area. Moreover, the impact of combining these models with more complex neural network layers has been investigated in many tasks, such as aspect term extraction [24], named entity recognition [25], and aspect category detection [26]. Yet further efforts are required to explore the effectiveness of this combined approach for the task of hate speech detection.…”
Section: Introduction
confidence: 99%