2020
DOI: 10.1007/978-3-030-47436-2_18

Exploiting the Matching Information in the Support Set for Few Shot Event Classification

Abstract: Existing event classification (EC) work primarily focuses on the traditional supervised learning setting, in which models are unable to extract event mentions of new/unseen event types. Few-shot learning has not been investigated in this area, although it would enable EC models to extend their operation to unobserved event types. To fill this gap, we investigate event classification under the few-shot learning setting. We propose a novel training method for this problem that extensively exploits t…
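The truncated abstract stops before describing the training method, so the sketch below is only a generic illustration of the few-shot setting it names: query event mentions are classified by matching them against a small labeled support set, here in the prototypical-network style. The encoder stand-in (random embeddings), the tensor shapes, and the squared-Euclidean matching are assumptions for illustration, not the paper's actual algorithm.

```python
# Minimal sketch of N-way K-shot event classification against a support set,
# in the prototypical-network style. Hypothetical shapes and encoder; not the
# paper's method.
import torch
import torch.nn.functional as F

def prototypes(support_emb: torch.Tensor) -> torch.Tensor:
    """Average the K support embeddings of each event type into a prototype.
    support_emb: (N_way, K_shot, dim) -> returns (N_way, dim)."""
    return support_emb.mean(dim=1)

def classify(query_emb: torch.Tensor, proto: torch.Tensor) -> torch.Tensor:
    """Match queries to prototypes by negative squared Euclidean distance.
    query_emb: (n_query, dim), proto: (N_way, dim) -> (n_query, N_way) log-probs."""
    dists = torch.cdist(query_emb, proto).pow(2)  # pairwise squared distances
    return F.log_softmax(-dists, dim=-1)

if __name__ == "__main__":
    N_way, K_shot, dim, n_query = 5, 5, 64, 8
    # Stand-ins for a sentence encoder's output over event mentions.
    support = torch.randn(N_way, K_shot, dim)
    queries = torch.randn(n_query, dim)
    log_probs = classify(queries, prototypes(support))
    print(log_probs.argmax(dim=-1))  # predicted event type per query
```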

Cited by 29 publications (4 citation statements)
References 18 publications

Citation statements:
“…Furthermore, our extensive experiments on two benchmark datasets show the superior performance of our method. In the future, we will extend our analysis to other Information Extraction tasks, such as Entity Mention Detection (Nguyen et al., 2016) and Event Classification (Lai et al., 2020; …).”
Section: Discussion (mentioning)
confidence: 99%
“…The above three assumptions also apply to few-shot learning, which is a developing branch of meta-learning. Few-shot learning has been increasingly introduced into the field of NLP, such as text classification [6, 7], word sense disambiguation [8], event detection [9, 10], and so on.…”
Section: Meta-learning (mentioning)
confidence: 99%
“…A large body of previous ED research is dedicated to monolingual learning, i.e., training and testing over the same languages (Nguyen et al., 2016; Yang and Mitchell, 2016; Lu and Nguyen, 2018; Lai et al., 2020; Lin et al., 2020). The models might consider different domains (Man Duc Trong et al., 2020).…”
Section: Related Work (mentioning)
confidence: 99%