Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2022
DOI: 10.18653/v1/2022.naacl-main.138

DEGREE: A Data-Efficient Generation-Based Event Extraction Model

Abstract: Event extraction requires high-quality expert human annotations, which are usually expensive. Therefore, learning a data-efficient event extraction model that can be trained with only a few labeled examples has become a crucial challenge. In this paper, we focus on low-resource end-to-end event extraction and propose DEGREE, a data-efficient model that formulates event extraction as a conditional generation problem. Given a passage and a manually designed prompt, DEGREE learns to summarize the events mentioned…

Cited by 69 publications (41 citation statements). References 20 publications.
“…The man returned to Los Angeles from Mexico following his capture Tuesday by bounty hunters. We follow the previous work (Hsu et al., 2021), extracting event records one type at a time, using the pretrained encoder-decoder language model BART (Lewis et al., 2020) for conditional generation. For each event type, we first initialize a type-specific prefix consisting of a sequence of tunable vectors as transformer history values (Li and Liang, 2021).…”
Section: Dynamic Type Information (mentioning; confidence: 99%)
“…There is a rising trend of casting the task of event extraction as a sequence generation problem, either by applying special decoding strategies (Paolini et al., 2021; Lu et al., 2021) or by steering pretrained language models to output conditional generation sequences with discrete prompts (Hsu et al., 2021). Compared with classification-based methods, this line of work is more data-efficient and flexible: it requires less annotated data to achieve acceptable performance, and it is easier to extend to new event types by slightly modifying the designed prompts and decoding strategies.…”
Section: Introduction (mentioning; confidence: 99%)
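To make the generation-based formulation concrete: in this paradigm the model generates a natural-language sentence that fills a role template, and the structured event record is then recovered by aligning the generated sentence back against the template. The sketch below illustrates only that alignment step with a toy template and regex matching; the template, role names, and sentence are hypothetical examples, not the actual prompts or outputs used by DEGREE or the cited systems.

```python
import re

def extract_arguments(template: str, generated: str) -> dict:
    """Recover role fillers by aligning a generated sentence against
    the natural-language template it was produced from.

    Placeholders such as <victim> in the template become named regex
    capture groups; all other template text is matched literally.
    """
    pattern = re.escape(template)
    # Turn each <role> placeholder into a lazy named capture group.
    pattern = re.sub(r"<(\w+)>", r"(?P<\1>.+?)", pattern)
    match = re.fullmatch(pattern, generated)
    return match.groupdict() if match else {}

# Hypothetical template and model output for a Transport-style event.
template = "<artifact> was transported to <destination> from <origin>."
generated = "The man was transported to Los Angeles from Mexico."
print(extract_arguments(template, generated))
# → {'artifact': 'The man', 'destination': 'Los Angeles', 'origin': 'Mexico'}
```

Because the template is ordinary language, extending the system to a new event type mostly means writing a new template sentence, which is one source of the flexibility the quoted passage describes.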
“…Inspired by the success of pretrained language models and the corresponding natural-language-generation-based paradigm for various NLP tasks [4, 21-23], some works tackle event extraction as controlled event generation. [6] is an end-to-end conditional generation method with manually designed discrete prompts for each event type, which needs more human effort to find the …”
[Fig. 2: The event extraction task.]
Section: Related Work (mentioning; confidence: 99%)
“…Natural language generation techniques have been successfully applied to a number of NLP tasks [4-6]. They have inspired the use of controlled event generation to tackle event extraction.…”
Section: Introduction (mentioning; confidence: 99%)