2019 IEEE Fourth International Conference on Data Science in Cyberspace (DSC)
DOI: 10.1109/dsc.2019.00024

Joint Extraction of Entities and Relations Based on Multi-label Classification

Cited by 9 publications (5 citation statements)
References 12 publications
“…An attention model was introduced to effectively improve on shortcomings in the literature, such as reliance on part-of-speech tags. Liu et al. [20] proposed a joint classification model based on multi-label classification. The model consists of a BiLSTM + DCNN encoder and a decoder for entity, relation, and joint entity-relation prediction.…”
Section: Knowledge Graph
Mentioning confidence: 99%
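To make the described encoder-decoder shape concrete, below is a minimal PyTorch sketch of a BiLSTM + dilated-CNN (DCNN) encoder feeding a per-token multi-label decoder. The layer sizes, label count, and dilation pattern are illustrative assumptions, not the configuration used by Liu et al.

    import torch
    import torch.nn as nn

    class JointEncoder(nn.Module):
        """Sketch of a BiLSTM + dilated-CNN encoder with a per-token
        multi-label decoder; all dimensions are assumed for illustration."""
        def __init__(self, vocab_size=10000, emb_dim=100, hidden=128, num_labels=24):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim)
            self.bilstm = nn.LSTM(emb_dim, hidden, batch_first=True,
                                  bidirectional=True)
            # Dilated convolutions widen the receptive field without pooling.
            self.dcnn = nn.Sequential(
                nn.Conv1d(2 * hidden, 2 * hidden, 3, padding=1, dilation=1),
                nn.ReLU(),
                nn.Conv1d(2 * hidden, 2 * hidden, 3, padding=2, dilation=2),
                nn.ReLU(),
            )
            self.decoder = nn.Linear(2 * hidden, num_labels)  # one logit per tag

        def forward(self, token_ids):
            x = self.embed(token_ids)                  # (batch, seq, emb)
            x, _ = self.bilstm(x)                      # (batch, seq, 2*hidden)
            x = self.dcnn(x.transpose(1, 2)).transpose(1, 2)
            return self.decoder(x)                     # (batch, seq, num_labels)

    model = JointEncoder()
    logits = model(torch.randint(0, 10000, (2, 12)))   # -> shape (2, 12, 24)
    # Multi-label training would apply a per-label sigmoid,
    # e.g. nn.BCEWithLogitsLoss, rather than a softmax over tags.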
“…Figure 2 explains how the sentences are tagged using the BIEOS scheme (Liu et al., 2019). In one sentence, multiple tag sequences are given, and each tag sequence contains only one triple.…”
Section: Model Description
Mentioning confidence: 99%
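As a hypothetical illustration of the BIEOS idea (Begin, Inside, End, Outside, Single) with one tag sequence per triple, the Python snippet below tags a made-up sentence; the label format, which joins a relation name to an argument-role index, is an assumption for exposition rather than the exact scheme of Liu et al.

    # Hypothetical sentence with one triple: (Barack Obama, Born_in, Honolulu).
    sentence = ["Barack", "Obama", "was", "born", "in", "Honolulu"]
    # BIEOS tags for that triple; "-1"/"-2" mark head/tail argument roles.
    tags = ["B-Born_in-1", "E-Born_in-1", "O", "O", "O", "S-Born_in-2"]
    for token, tag in zip(sentence, tags):
        print(f"{token:10s}{tag}")
    # A second triple in the same sentence would receive its own tag
    # sequence, which is why a multi-label formulation fits naturally.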
“…The encoder part consists of a primary task-shared representation layer (PTSRL), a contextual word representation layer (CWRL), and a relation-based attention module with a gating mechanism (RBAGM). First, a tagging scheme (Liu et al., 2019) is applied to convert the joint extraction task into a sequence labeling problem, mapping a sentence into a tag sequence. Then, the PTSRL, composed of BERT (Devlin et al., 2018) and a bidirectional long short-term memory network (BiLSTM) (Schuster and Paliwal, 1997), is used to obtain word embeddings.…”
Section: Introduction
Mentioning confidence: 99%
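A minimal sketch of the PTSRL idea, BERT subword states refined by a BiLSTM, using the Hugging Face transformers library; the checkpoint name and hidden sizes are assumptions for illustration, not the cited paper's settings.

    import torch
    import torch.nn as nn
    from transformers import AutoModel, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    bert = AutoModel.from_pretrained("bert-base-uncased")
    bilstm = nn.LSTM(input_size=768, hidden_size=256,
                     batch_first=True, bidirectional=True)

    inputs = tokenizer("Joint extraction as sequence labeling.",
                       return_tensors="pt")
    with torch.no_grad():
        hidden = bert(**inputs).last_hidden_state   # (1, seq_len, 768)
    word_repr, _ = bilstm(hidden)                   # (1, seq_len, 512)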
“…Wadden et al. (2019) introduced a system that shares embeddings to extract named entities, builds binary relation candidates, and classifies the relation between them. Zheng et al. (2017) and Liu et al. (2019) introduce sequence label schemes to explicitly extract attributes and their relations in a single classification step. Our models similarly extract more complex structures without an enumeration of relation candidates.…”
Section: Related Work
Mentioning confidence: 99%
“…Our second approach is motivated by Liu et al. (2019). They applied a sequence labeling approach instead of sequence tagging.…”
Section: Span Labeling
Mentioning confidence: 99%