2020
DOI: 10.1007/s11431-020-1673-6
A survey on neural relation extraction

Cited by 24 publications (10 citation statements)
References 38 publications
“…We identified several surveys that address general relation extraction [50][51][52][53][54][55]. In contrast, there are fewer surveys dedicated to the topics of event extraction [56] and temporal relation extraction [57,58].…”
Section: Hand Crafted Features (citation type: mentioning)
confidence: 99%
“…Supervised entity relation extraction is essentially a classification task that can be performed using machine-learning models trained on annotated data. Recently, neural network-based models have been widely employed (Liu, 2020). The most significant challenge in supervised entity relation extraction is that annotated training data are scarce in most domains (Vo and Bagheri, 2019).…”
Section: Related Research (citation type: mentioning)
confidence: 99%
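The quote above frames supervised relation extraction as sentence classification over annotated data. A minimal, self-contained sketch of that framing is given below; the toy sentences, entity markers, bag-of-words features, and one-layer softmax classifier are all illustrative assumptions, not the actual models used in the cited works.

```python
# Sketch: relation extraction as supervised classification.
# Each example is a sentence with two marked entities and a relation label.
import numpy as np

# Toy annotated corpus (hypothetical examples).
DATA = [
    ("<e1>Apple</e1> was founded by <e2>Steve Jobs</e2>", "founded_by"),
    ("<e1>Google</e1> was founded by <e2>Larry Page</e2>", "founded_by"),
    ("<e1>Paris</e1> is located in <e2>France</e2>", "located_in"),
    ("<e1>Berlin</e1> is located in <e2>Germany</e2>", "located_in"),
]

# Vocabulary and label index built from the corpus.
vocab = sorted({w for s, _ in DATA for w in s.split()})
labels = sorted({y for _, y in DATA})
w2i = {w: i for i, w in enumerate(vocab)}
l2i = {l: i for i, l in enumerate(labels)}

def featurize(sentence):
    """Bag-of-words vector over the entity-marked sentence."""
    x = np.zeros(len(vocab))
    for w in sentence.split():
        if w in w2i:
            x[w2i[w]] += 1.0
    return x

X = np.stack([featurize(s) for s, _ in DATA])
y = np.array([l2i[l] for _, l in DATA])

# One-layer softmax classifier trained by full-batch gradient descent.
rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(len(vocab), len(labels)))
for _ in range(200):
    logits = X @ W
    probs = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs /= probs.sum(axis=1, keepdims=True)
    probs[np.arange(len(y)), y] -= 1.0      # gradient of cross-entropy w.r.t. logits
    W -= 0.5 * (X.T @ probs) / len(y)       # gradient step

def predict(sentence):
    """Predict the relation label for an entity-marked sentence."""
    return labels[int(np.argmax(featurize(sentence) @ W))]
```

Neural relation extractors replace the bag-of-words features with learned sentence encoders (CNNs, RNNs, or pretrained transformers), but the supervised classification framing is the same.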
“…Moreover, this table is also supplemented with the core techniques used to solve these problems; it clearly indicates that deep learning techniques are the most widely researched and applied to solve them. For more extensive reviews of the techniques, as well as further discussion of their weaknesses and future prospects, we refer to recent NLP survey papers such as [76,77,78,79,80]. Additionally, their performance can be significantly boosted by applying transfer learning with pretrained language models, such as BERT [39], ELMO [41], RoBERTa [81], ELECTRA [82], XLNet [83], T5 [84] or Microsoft's DeBERTa [85].…”
Section: Introducing NLP To Model-to-Model (M2M) Transformations (citation type: mentioning)
confidence: 99%