Proceedings of the Conference Recent Advances in Natural Language Processing - Deep Learning for Natural Language Processing, 2021
DOI: 10.26615/978-954-452-072-4_116

Improving Distantly Supervised Relation Extraction with Self-Ensemble Noise Filtering

Abstract: Distantly supervised models are very popular for relation extraction since we can obtain a large amount of training data using the distant supervision method without human annotation. In distant supervision, a sentence is considered as a source of a tuple if the sentence contains both entities of the tuple. However, this condition is too permissive and does not guarantee the presence of relevant relation-specific information in the sentence. As such, distantly supervised training data contains much noise which…
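
The labeling heuristic described in the abstract can be made concrete with a short sketch. The snippet below is a minimal illustration, not code from the paper; the knowledge-base tuples, example sentences, and function names are assumptions. It pairs a sentence with a knowledge-base tuple whenever both entities occur in the sentence, which is exactly the permissive condition that lets noisy pairs into the training data.

def contains_entity(sentence, entity):
    # Naive surface match; real pipelines would use NER or entity linking.
    return entity.lower() in sentence.lower()

def distant_label(sentences, kb_tuples):
    # Pair a sentence with a KB tuple whenever both of the tuple's entities
    # appear in it. Matching both entities does not guarantee the sentence
    # expresses the relation, so some returned pairs are noise.
    labeled = []
    for sent in sentences:
        for head, relation, tail in kb_tuples:
            if contains_entity(sent, head) and contains_entity(sent, tail):
                labeled.append((sent, (head, relation, tail)))
    return labeled

kb = [("Barack Obama", "born_in", "Honolulu")]
corpus = [
    "Barack Obama was born in Honolulu, Hawaii.",         # expresses the relation
    "Barack Obama gave a speech in Honolulu last week.",  # both entities, wrong relation
]
for sent, tup in distant_label(corpus, kb):
    print(tup, "<-", sent)

Both example sentences get labeled with (Barack Obama, born_in, Honolulu), even though only the first one actually expresses that relation; filtering out pairs of the second kind is the problem the paper addresses.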

Cited by 5 publications (1 citation statement)
References 16 publications

“…Hao et al (2021) adopts adversarial training to filter noisy instances in the dataset. Nayak et al (2021) designs a self-ensemble framework to filter noisy instances despite information loss. proposes a hierarchical contrastive learning framework to reduce the effect of noise.…”
Section: Related Work (mentioning)
confidence: 99%
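
The excerpt above only names the self-ensemble noise filter; it does not describe its mechanics. As a hedged illustration of the general idea (an assumption made for exposition, not the authors' published algorithm), one common way to build a self-ensemble is to keep an exponential moving average (EMA) of the model's weights as a "teacher" and to drop distantly labeled instances to which the teacher assigns low probability:

import torch

def update_ema(student, teacher, decay=0.99):
    # Move each teacher parameter toward the corresponding student parameter,
    # so the teacher is a temporal ensemble of past student states.
    with torch.no_grad():
        for t_param, s_param in zip(teacher.parameters(), student.parameters()):
            t_param.mul_(decay).add_(s_param, alpha=1.0 - decay)

def filter_noisy(teacher, inputs, distant_labels, threshold=0.5):
    # Keep only instances where the teacher's probability for the distant
    # label exceeds the threshold; the rest are treated as labeling noise.
    with torch.no_grad():
        probs = torch.softmax(teacher(inputs), dim=-1)
        label_probs = probs.gather(1, distant_labels.unsqueeze(1)).squeeze(1)
        keep = label_probs > threshold
    return inputs[keep], distant_labels[keep]

# Usage sketch (the relation classifier and data loader are assumptions):
# student = RelationClassifier(...)
# teacher = copy.deepcopy(student)   # requires `import copy`
# for inputs, labels in loader:
#     inputs, labels = filter_noisy(teacher, inputs, labels)
#     ...train the student on the filtered batch...
#     update_ema(student, teacher)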