2022
DOI: 10.1145/3503917
MiDTD: A Simple and Effective Distillation Framework for Distantly Supervised Relation Extraction

Abstract: Relation extraction (RE), an important information extraction task, faces the great challenge of limited annotation data. To this end, distant supervision was proposed to automatically label RE data, largely increasing the number of annotated instances. Unfortunately, the many noisy relation annotations introduced by automatic labeling become a new obstacle. Some recent studies have shown that the teacher-student framework of knowledge distillation can alleviate the interference of noisy relation a…

Cited by 4 publications (1 citation statement)
References 68 publications
“…Song et al (2021) integrate ground-truth sentence-level identification information into the teacher network during training, then transfer it to the student by sharing the classification layer to counter the data imbalance problem. KD has also been used to alleviate the interference of noise from relation annotations in distant supervision via label softening (Li et al, 2022).…”
Section: Knowledge Distillation for RE (mentioning, confidence: 99%)