2023
DOI: 10.1109/taffc.2023.3239540
Cross-Domain Aspect-Based Sentiment Classification by Exploiting Domain-Invariant Semantic-Primary Feature

Cited by 9 publications (3 citation statements)
References 58 publications
“…According to Zhang et al. [17], the transformer-based semantic-primary knowledge transferring (TSPKT) network for cross-domain aspect-term sentiment analysis makes use of semantic-primary information to facilitate knowledge transfer across multiple domains. This semantic-primary information is generated using external semantic lexicons before being collected from the S-Graph.…”
Section: Related Work
confidence: 99%
“…Liang et al. [9] developed an embedding refinement framework targeted at ABSA, furthering the understanding of how sophisticated embedding techniques can improve sentiment analysis accuracy. Addressing the challenges of cross-domain analysis, Zhang et al. [13] exploited domain-invariant semantic-primary features for cross-domain ABSA, highlighting the importance of domain adaptability in sentiment analysis models. Cao et al. [15] implemented a heterogeneous reinforcement learning network with external knowledge for ABSA, demonstrating how integrating external knowledge sources can improve model performance.…”
Section: Literature Review
confidence: 99%
“…in which LN and FF are the normalization and feed-forward layers of the BERT model. Zhang et al. (2023a) developed the Transformer-based Semantic-Primary Knowledge Transferring network model (TSPKT), considering syntactic relationships and generalizing semantic relations. They argue that high-level abstractions of aspects and opinion terms help in cross-domain scenarios.…”
Section: Rules
confidence: 99%