2022
DOI: 10.5715/jnlp.29.187

Named Entity Recognition and Relation Extraction Using Enhanced Table Filling by Contextualized Representations

Abstract: In this study, we propose a method designed to extract named entities and relations from unstructured text based on table representations. To extract named entities, the proposed method computes representations for entity mentions and long-range dependencies using contextualized representations without hand-crafted features or complex neural network architectures. To extract relations, it applies a tensor dot product to predict all relation labels simultaneously without considering dependencies among relation …
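The mechanism named in the abstract, scoring every head-tail-label cell of the relation table at once with a tensor dot product, can be illustrated with a short sketch. The following is a minimal, hypothetical PyTorch sketch of that idea, not the authors' released implementation: the projection layers, the per-label embedding vectors, and all tensor sizes are assumptions made for illustration.

import torch

batch, seq_len, hidden, num_rel = 2, 8, 768, 5

# Contextualized token representations, e.g. the last hidden states of BERT.
h = torch.randn(batch, seq_len, hidden)

# Hypothetical projections into head/tail spaces (not the paper's exact layers).
head_proj = torch.nn.Linear(hidden, hidden)
tail_proj = torch.nn.Linear(hidden, hidden)
# One scoring vector per relation label; a full bilinear form would use a
# (hidden, hidden, num_rel) tensor, but this low-rank variant keeps the sketch small.
rel_emb = torch.nn.Parameter(torch.randn(num_rel, hidden))

heads = head_proj(h)  # (batch, seq, hidden)
tails = tail_proj(h)  # (batch, seq, hidden)

# Pairwise table of head-tail interactions combined with every relation
# embedding in one einsum: all labels for all table cells are scored
# simultaneously, with no sequential decoding over previously predicted labels.
scores = torch.einsum('bih,bjh,rh->bijr', heads, tails, rel_emb)
print(scores.shape)  # torch.Size([2, 8, 8, 5])

Predicting each cell independently in this way is exactly what makes the method fast, at the cost of ignoring dependencies among relation labels, which is the trade-off the citing papers below discuss.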

Cited by 17 publications (21 citation statements)
References 25 publications
“…Although the system is simple and effective, it ignores the dependencies among predicted relation labels. As noted in Ma et al. (2022), incorporating label dependencies through refined decoding orders does not improve performance.…”
Section: Introduction
Confidence: 91%
“…Yan et al. (2021) applied a partition filter to divide neurons into multiple partitions and generated task-specific features based on a linear combination of these partitions. Moreover, Ma et al. (2022) applied a tensor dot product over table representations to predict all relation labels simultaneously for RE. This study applies their system as a strong baseline and explores the effect of incorporating local dependencies on top of BERT.…”
Section: NER and RE Using Contextualized Representations
Confidence: 99%
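The partition idea quoted above can also be sketched briefly. The following is a loose, hypothetical PyTorch illustration, not Yan et al.'s (2021) actual Partition Filter Network: the soft three-way gate and the additive combination of partitions are assumptions chosen only to keep the sketch short.

import torch

hidden = 768
h = torch.randn(4, hidden)  # token representations for 4 tokens (toy sizes)

# Hypothetical gating layer producing a soft three-way assignment per neuron:
# an NER-specific partition, an RE-specific partition, and a shared partition.
gate_layer = torch.nn.Linear(hidden, 3 * hidden)
gates = torch.softmax(gate_layer(h).view(-1, 3, hidden), dim=1)
ner_part = gates[:, 0] * h
re_part = gates[:, 1] * h
shared = gates[:, 2] * h

# Task-specific features as linear combinations of a task's own partition
# with the shared partition.
ner_feat = ner_part + shared
re_feat = re_part + shared
print(ner_feat.shape, re_feat.shape)  # torch.Size([4, 768]) twice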