Proceedings of the Thirty-First International Joint Conference on Artificial Intelligence 2022
DOI: 10.24963/ijcai.2022/407
Function-words Adaptively Enhanced Attention Networks for Few-Shot Inverse Relation Classification

Abstract: Existing unsupervised domain adaptation (UDA) studies focus on transferring knowledge in an offline manner. However, many tasks involve online requirements, especially in real-time systems. In this paper, we discuss Online UDA (OUDA) which assumes that the target samples are arriving sequentially as a small batch. OUDA tasks are challenging for prior UDA methods since online training suffers from catastrophic forgetting which leads to poor generalization. Intuitively, a good memory is a crucial factor in the s…

Cited by 10 publications (9 citation statements) · References 0 publications
“…We conducted our experiments on two large-scale benchmark datasets. The experimental results show that our model outperforms all baseline models, especially the recent state-of-the-art model [3]. Further studies demonstrate our ability to mitigate the similar relation problem effectively.…”
Section: Introduction (mentioning)
confidence: 76%
“…However, these models are devoted to learning each class's prototype representations individually, without considering the higher-order interactions between the different classes. Recently, FAEA [3] explores constructing an instance-level graph in the support set. However, they mainly focus on intra-class function word detection, without considering the impact of high-order interactions, limiting the task's performance.…”
Section: Related Work (mentioning)
confidence: 99%
“…For few-shot inverse relation classification, FAEA [265] employs a hybrid attention model to attend class-related words based on meta-learning. It leverages function-words enhanced attention to effectively compute representations for both support and query instances, facilitating message passing and similarity calculation between queries and prototypes.…”
Section: Few-shot Inverse Relation Classification (mentioning)
confidence: 99%
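
The last quoted statement sketches FAEA's general recipe: attend to class-related (function) words, pool support instances into class prototypes, and compare queries against those prototypes. As a rough illustration only, and not the authors' implementation, the sketch below shows the prototype-similarity part of that recipe in PyTorch: token representations are re-weighted by a softmax attention over a single learned query vector (a hypothetical stand-in for FAEA's function-words enhanced attention) before being pooled and compared by dot product. The hybrid attention and message-passing components mentioned in the quote are not modeled here.

```python
# Illustrative sketch only (assumes PyTorch); NOT the authors' FAEA code.
# Shows attention-weighted pooling of token representations followed by a
# nearest-prototype decision between a query and per-class prototypes.
import torch
import torch.nn.functional as F


def attentive_pool(token_reprs: torch.Tensor, attn_query: torch.Tensor) -> torch.Tensor:
    """Pool token representations [L, d] into one vector [d].

    `attn_query` [d] is a hypothetical learned vector standing in for the
    function-words enhanced attention described in the quote."""
    scores = token_reprs @ attn_query               # [L]
    weights = F.softmax(scores, dim=0)              # [L], emphasis on class-related words
    return (weights.unsqueeze(-1) * token_reprs).sum(dim=0)


def classify_query(support: dict, query_tokens: torch.Tensor, attn_query: torch.Tensor):
    """`support` maps class name -> list of [L_i, d] token tensors.

    Returns the class whose prototype (mean of pooled support instances)
    has the highest dot-product similarity with the pooled query."""
    query_repr = attentive_pool(query_tokens, attn_query)
    best_cls, best_sim = None, float("-inf")
    for cls, instances in support.items():
        proto = torch.stack([attentive_pool(x, attn_query) for x in instances]).mean(dim=0)
        sim = torch.dot(query_repr, proto).item()
        if sim > best_sim:
            best_cls, best_sim = cls, sim
    return best_cls


# Tiny usage example with random tensors (d = 8, two classes, 2-shot):
if __name__ == "__main__":
    d = 8
    attn_query = torch.randn(d)
    support = {
        "capital_of": [torch.randn(5, d), torch.randn(7, d)],
        "located_in": [torch.randn(6, d), torch.randn(4, d)],
    }
    print(classify_query(support, torch.randn(5, d), attn_query))
```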