2016
DOI: 10.1007/978-3-319-50496-4_44
Exploring Long Tail Data in Distantly Supervised Relation Extraction

Cited by 9 publications (4 citation statements)
References 12 publications
“…There are few studies on long-tail relation extraction. Among them, Gui et al [24] proposed an explanation-based approach, while Lei et al [25] utilized external knowledge (logic rules). Later studies focused on datasets with hierarchical relation structures, such as NYT: Han et al [26] proposed a hierarchical attention mechanism for long-tail relations, and Zhang et al [27] combined embedding representations pre-trained with graph convolutional networks [28] into the hierarchical structure of relations. Because this class of methods does not work well on other datasets, this paper uses the traditional approach, i.e., mitigating the data imbalance problem by increasing the loss on small-class samples.…”
Section: Related Work
confidence: 99%
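The statement above ends with the citing paper's own remedy: up-weighting the loss on small ("long-tail") classes. A minimal sketch of that idea, using inverse-frequency class weights in a cross-entropy loss, is shown below; the class counts and the specific weighting scheme are illustrative assumptions, not taken from the cited work.

```python
import math

def inverse_frequency_weights(counts):
    """Weight each class by total / (num_classes * count), so classes
    with few samples contribute more to the loss (illustrative scheme)."""
    total = sum(counts)
    k = len(counts)
    return [total / (k * c) for c in counts]

def weighted_cross_entropy(probs, label, class_weights):
    """Cross-entropy for one example, scaled by the true class's weight.

    probs: predicted probability per class (sums to 1)
    label: index of the true class
    """
    return -class_weights[label] * math.log(probs[label])

# Hypothetical counts: class 0 is a head class (900 samples),
# class 2 is a tail class (10 samples).
weights = inverse_frequency_weights([900, 90, 10])
probs = [0.2, 0.3, 0.5]

head_loss = weighted_cross_entropy(probs, 0, weights)  # true class = head
tail_loss = weighted_cross_entropy(probs, 2, weights)  # true class = tail
```

Even though the model assigns the tail class a higher probability (0.5 vs. 0.2), its loss dominates because of the large weight, which is exactly the rebalancing effect the citing paper relies on.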
“…Krause et al [50] develop an RE system that automatically learns rules for long-tailed relations from the Web. To generate fewer but more precise rules, Gui et al [51] apply explanation-based learning to improve the RE system. However, the above methods handle each relation in isolation, disregarding the implicit associations between relations.…”
Section: Long-tailed Relation Extraction in DSRE
confidence: 99%
“…There are only a few studies on long-tail relations for RE (Gui et al, 2016; Lei et al, 2018; Han et al, 2018b). Of these, Gui et al (2016) proposed an explanation-based approach, whereas Lei et al (2018) utilized external knowledge (logic rules). These studies treat each relation in isolation, disregarding the rich semantic correlations between relations.…”
Section: Related Work
confidence: 99%