Proceedings of the 28th International Conference on Computational Linguistics (COLING 2020)
DOI: 10.18653/v1/2020.coling-main.510

Learning to Decouple Relations: Few-Shot Relation Classification with Entity-Guided Attention and Confusion-Aware Training

Abstract: This paper aims to enhance few-shot relation classification, especially for sentences that jointly describe multiple relations. Because some relations frequently co-occur in the same context, previous few-shot relation classifiers struggle to distinguish them with few annotated instances. To alleviate this relation confusion problem, we propose CTEG, a model equipped with two mechanisms that learn to decouple these easily-confused relations. On the one hand, an Entity-Guided Attention…

Cited by 32 publications (11 citation statements) · References 24 publications
“…This article compares the BMAN with the following baselines: MLMAN (Ye & Ling, 2019): this approach interactively encodes support instances with query instances, and the matching process strengthens the connection between the query set and the relational prototype. CTEG (Wang et al, 2020): this model uses entity-guided attention mechanisms and confusion-aware training to distinguish easily confused relations. TPN(BERT) (Wen et al, 2021): this method integrates the transformer model into a prototypical network and uses pre-trained BERT as the model's encoder.…”
Section: Methods
mentioning confidence: 99%
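As context for these prototype-style baselines, here is a minimal sketch of prototypical-network matching, assuming a generic sentence encoder (the random tensors below merely stand in for BERT output); the function and variable names are illustrative, not taken from any of the cited papers.

```python
import torch
import torch.nn.functional as F

def prototype_classify(support_emb, support_labels, query_emb, n_way):
    """Score query instances against class prototypes."""
    # Prototype for each relation = mean of its support embeddings.
    prototypes = torch.stack([
        support_emb[support_labels == r].mean(dim=0) for r in range(n_way)
    ])                                               # [n_way, D]
    # Negative squared Euclidean distance as the matching score.
    dists = torch.cdist(query_emb, prototypes) ** 2  # [Q, n_way]
    return F.log_softmax(-dists, dim=-1)             # log-probs over relations

# Toy 5-way 1-shot episode; random tensors stand in for encoded sentences.
torch.manual_seed(0)
support = torch.randn(5, 768)   # N*K = 5 encoded support instances
labels = torch.arange(5)        # one instance per relation
queries = torch.randn(3, 768)   # 3 encoded query instances
print(prototype_classify(support, labels, queries, n_way=5).argmax(dim=-1))
```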
“…Accuracy is adopted as the evaluation metric: $\text{Accuracy} = \frac{TP + TN}{TP + FP + FN + TN}$ (15), where $TP + TN$ is the number of queries classified correctly, and $TP + FP + FN + TN$ is the total number of queries.…”
Section: Methods
mentioning confidence: 99%
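In N-way episode evaluation this metric reduces to the fraction of query instances predicted correctly; a minimal sketch of Eq. (15), with illustrative names:

```python
def accuracy(true_labels, predicted_labels):
    """Eq. (15): correctly classified queries / all queries.

    In the confusion-matrix notation of the excerpt, the numerator is
    TP + TN and the denominator is TP + FP + FN + TN.
    """
    correct = sum(t == p for t, p in zip(true_labels, predicted_labels))
    return correct / len(true_labels)

# 4 of 5 queries correct -> 0.8
print(accuracy([0, 1, 2, 3, 4], [0, 1, 2, 3, 0]))
```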
“…The model aimed to classify query instances and drew on knowledge from the supporting examples to obtain better instance representations. Wang et al [15] proposed the CTEG model, which uses entity-guided attention and confusion-aware training to decouple easily confused relations. Optimization-based methods follow the idea of differentiating an optimization process over the support set within the meta-learning framework.…”
Section: Few-shot Relation Classification
mentioning confidence: 99%
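To illustrate the optimization-based idea mentioned at the end of this excerpt (a generic MAML-style sketch, not CTEG's method), here is a minimal example in PyTorch: one inner gradient step on the support set is kept differentiable so the query loss can be backpropagated through the adaptation itself. The toy data and all names are assumptions for illustration.

```python
import torch

def inner_adapt(params, loss_fn, lr_inner=0.1):
    """One MAML-style gradient step on the support set.

    create_graph=True keeps the update differentiable, so the query-set
    loss can later be backpropagated through this adaptation step.
    """
    loss = loss_fn(params)
    grads = torch.autograd.grad(loss, params, create_graph=True)
    return [p - lr_inner * g for p, g in zip(params, grads)]

# Toy episode: linear model, squared loss on a single support example.
w = torch.zeros(3, requires_grad=True)
x_support, y_support = torch.tensor([1., 2., 3.]), torch.tensor(2.0)
support_loss = lambda ps: (ps[0] @ x_support - y_support) ** 2

w_adapted, = inner_adapt([w], support_loss)
x_query, y_query = torch.tensor([0., 1., 0.]), torch.tensor(1.0)
query_loss = (w_adapted @ x_query - y_query) ** 2
query_loss.backward()   # meta-gradient w.r.t. the initial parameters w
print(w.grad)
```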
“…In order to alleviate the problem of insufficient training data, MICK (Geng et al, 2020) learns general language rules and grammatical knowledge from cross-domain datasets. Wang et al (2020) propose the CTEG model to solve the relation confusion problem of FSRE. Cong et al (2020) propose an inductive clustering-based framework, DaFeC, to solve the problem of domain adaptation in FSRE.…”
Section: Introduction
mentioning confidence: 99%