Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
DOI: 10.18653/v1/2020.emnlp-main.304
Recurrent Interaction Network for Jointly Extracting Entities and Classifying Relations

Abstract: The idea of using multi-task learning approaches to address the joint extraction of entities and relations is motivated by the relatedness between the entity recognition task and the relation classification task. Existing methods using multi-task learning techniques to address the problem learn interactions between the two tasks through a shared network, where the shared information is passed into the task-specific networks for prediction. However, such an approach hinders the model from learning explicit interactio…
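The shared-network multi-task setup that the abstract critiques can be sketched as follows. This is a minimal illustration of the general pattern (one shared encoder feeding two task-specific heads), not the paper's RIN architecture; all dimensions, parameter names, and the pooling choice are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def linear(x, w, b):
    return x @ w + b

# Hypothetical sizes, chosen only for illustration.
d_in, d_shared, n_entity_tags, n_relations = 8, 16, 5, 3

# Shared encoder parameters (trained jointly on both tasks in a real model).
W_shared = rng.normal(size=(d_in, d_shared))
b_shared = np.zeros(d_shared)

# Task-specific heads: entity recognition and relation classification.
W_ent = rng.normal(size=(d_shared, n_entity_tags))
b_ent = np.zeros(n_entity_tags)
W_rel = rng.normal(size=(d_shared, n_relations))
b_rel = np.zeros(n_relations)

def forward(tokens):
    """tokens: (seq_len, d_in) token features -> per-task logits."""
    h = np.tanh(linear(tokens, W_shared, b_shared))     # shared representation
    ent_logits = linear(h, W_ent, b_ent)                # one tag score vector per token
    rel_logits = linear(h.mean(axis=0), W_rel, b_rel)   # sentence-level relation scores
    return ent_logits, rel_logits

tokens = rng.normal(size=(4, d_in))
ent_logits, rel_logits = forward(tokens)
print(ent_logits.shape, rel_logits.shape)  # (4, 5) (3,)
```

In this pattern the two tasks interact only implicitly, through the shared parameters `W_shared`; the abstract's point is that no explicit, task-to-task interaction is modeled, which is what the proposed recurrent interaction network addresses.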

Cited by 25 publications (8 citation statements)
References 38 publications
“…We compare our model with three kinds of models in recent years: (1) seq2seq-based methods, including CopyRE [5] and WDec [14], (2) MTL-based methods, including GraphRel [18], CopyMTL [16] and RIN [19], (3) tagging-based methods, including NovelTagging [4], ETL-Span [7] and CasRel [8]. Table I shows the results of our models and other baseline methods.…”
Section: Comparison Models and Results
confidence: 99%
“…[18] gained considerable improvement through building relation-weighted graph convolutional networks (GCN). [19] designed a novel multitask learning architecture that enables dynamic interaction and mutual learning between NER and RC, which improves the ability to extract triples. Although effective, they lack the elegance to handle complex scenarios, such as EPO cases.…”
Section: Related Work
confidence: 99%
“…On all datasets, we run our model 5 times and the averaged results are taken as the final reported results. Baselines The following strong state-of-the-art models are taken as baselines: ETL-Span [26], WDec [16], RSAN [27], RIN [19], CasRel [24], TPLinker [23], StereoRel [22], PRGC [33], R-BPtrNet [3], PMEI [20], and CGT [25]. Most results of these baselines are copied from their original papers directly.…”
Section: Experiments 4.1 Experiment Settings
confidence: 99%
“…Baselines We compare our model with the following strong state-of-the-art RTE models: CopyRE (Zeng et al, 2018), GraphRel (Fu et al, 2019), Copy-MTL, OrderCopyRE (Zeng et al, 2019), ETL-Span (Yu et al, 2019), WDec (Nayak and Ng, 2020), RSAN, RIN (Sun et al, 2020), CasRel (Wei et al, 2020), TPLinker, SPN (Sui et al, 2020), and PMEI (Sun et al, 2021).…”
Section: Experimental Settings
confidence: 99%