2022
DOI: 10.3390/app12042185

Commonsense Knowledge-Aware Prompt Tuning for Few-Shot NOTA Relation Classification

Abstract: Compared with the traditional few-shot task, few-shot none-of-the-above (NOTA) relation classification targets the realistic scenario in which a test instance might not belong to any of the target categories. This makes the task considerably harder: the handful of support samples cannot, on their own, represent the distribution of NOTA instances in the embedding space. The model therefore needs to make full use of the syntactic and word-meaning information learned during pre-training…
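To make the difficulty concrete: a common few-shot baseline compares a query to class prototypes built from the support set and falls back to NOTA when nothing matches well enough. Below is a minimal sketch of that baseline (not the paper's method); the cosine-similarity scoring and the fixed threshold are illustrative assumptions.

    # Minimal NOTA baseline sketch (illustrative, not the paper's method).
    # A query is assigned to the nearest class prototype unless even the
    # best match falls below a hand-picked similarity threshold, in which
    # case it is labeled NOTA. With only a few supports per class,
    # setting this threshold well is exactly what makes the task hard.
    import numpy as np

    def classify_with_nota(query_emb, prototypes, threshold=0.5):
        """prototypes: dict mapping relation label -> mean support embedding."""
        def cosine(a, b):
            return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

        sims = {label: cosine(query_emb, proto) for label, proto in prototypes.items()}
        best = max(sims, key=sims.get)
        return best if sims[best] >= threshold else "NOTA"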

Cited by 4 publications (11 citation statements)
References 28 publications
“…In addition to "real word" tokens, BERT uses some special tokens. Prompt-tuning: the idea behind prompt-tuning is to benefit from BERT's ability to predict masked ([MASK]) tokens. It has already been used in few-shot RE by [15, 5, 35, 56] and can be summarized as follows:…”
Section: Relation Encoder
confidence: 99%
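The masked-token ability this statement refers to is easy to demonstrate. Here is a minimal sketch using the Hugging Face transformers fill-mask pipeline; the model checkpoint and example sentence are assumptions for illustration, not taken from the cited papers.

    # BERT's pretrained masked-language-model head predicts [MASK] tokens;
    # prompt-tuning reuses exactly this ability for classification.
    from transformers import pipeline

    fill_mask = pipeline("fill-mask", model="bert-base-uncased")

    # Each candidate is a dict with the predicted token and its score.
    for candidate in fill_mask("Paris is the [MASK] of France."):
        print(candidate["token_str"], round(candidate["score"], 3))
    # The top prediction should be "capital".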
“…Design a prompt P, which is a sequence of tokens that includes one [MASK] token. For instance, Lv et al. [35] use the template…”
Section: Relation Encoder
confidence: 99%
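Since the exact template of Lv et al. [35] is elided above, the sketch below uses a hypothetical template and verbalizer to show the mechanics the statement describes: insert one [MASK] into a prompt built around the input sentence, then read BERT's distribution at that position to score candidate relation labels.

    import torch
    from transformers import BertTokenizer, BertForMaskedLM

    tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
    model = BertForMaskedLM.from_pretrained("bert-base-uncased")

    sentence = "Steve Jobs founded Apple in 1976."
    # Hypothetical template (NOT the elided template of Lv et al. [35]).
    prompt = sentence + " The relation between Steve Jobs and Apple is [MASK]."
    # Hypothetical verbalizer: one label word per candidate relation.
    verbalizer = {"founder": "org:founded_by", "employee": "per:employee_of"}

    inputs = tokenizer(prompt, return_tensors="pt")
    # Locate the single [MASK] position in the tokenized prompt.
    mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero()[0, 1]

    with torch.no_grad():
        logits = model(**inputs).logits[0, mask_pos]  # scores over the vocabulary

    # Score each relation by the logit of its label word at the [MASK] position.
    for word, relation in verbalizer.items():
        print(relation, logits[tokenizer.convert_tokens_to_ids(word)].item())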