2019
DOI: 10.1609/aaai.v33i01.33017484
Multi-Labeled Relation Extraction with Attentive Capsule Network

Abstract: Disclosing multiple overlapped relations in a sentence remains challenging. Most current neural models inconveniently assume that each sentence is explicitly mapped to a single relation label, and so cannot handle multiple relations properly: the overlapped features of the relations are either ignored or very difficult to identify. To tackle this issue, we propose a novel approach for multi-labeled relation extraction with a capsule network, which performs considerably better than current convo…



Cited by 56 publications (27 citation statements) · References 3 publications
“…They also found that capsule networks exhibit significant improvements when transferring from single-label to multi-label text classification. A similar property has also been observed in the task of relation extraction (Zhang et al., 2019). However, interactive word-level attention is not considered in these typical capsule routing methods.…”
Section: Capsule Network (supporting)
confidence: 67%
“…Capsule networks (Hinton et al., 2011; Sabour et al., 2017; Hinton et al., 2018) construct vector-based feature representations. Capsules in adjacent layers are connected by dynamic routing, which shows strengths in distinguishing overlapped features through feature clustering (Sabour et al., 2017; Zhang et al., 2019). In the aspect-level sentiment classification task, the vector-based overlapped sentiment features towards different aspect terms are clustered by an Expectation-Maximization (EM) routing algorithm, which makes the subsequent classification clearer.…”
Section: Introduction (mentioning)
confidence: 99%
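The dynamic-routing-by-agreement procedure the statement above refers to (Sabour et al., 2017) can be sketched in a few lines. This is a minimal illustrative implementation, not the cited papers' exact configuration: the capsule shapes, iteration count, and random inputs are assumptions for demonstration.

```python
# Minimal sketch of dynamic routing by agreement (Sabour et al., 2017):
# low-level capsule predictions are iteratively clustered into high-level
# capsules, with routing weights increased where predictions agree.
# All shapes and the 3-iteration count are illustrative assumptions.
import numpy as np

def squash(s, axis=-1, eps=1e-8):
    """Non-linearity that keeps a vector's orientation but maps its
    length into [0, 1): short vectors shrink toward zero, long vectors
    approach unit length."""
    norm_sq = np.sum(s ** 2, axis=axis, keepdims=True)
    return (norm_sq / (1.0 + norm_sq)) * s / np.sqrt(norm_sq + eps)

def dynamic_routing(u_hat, n_iters=3):
    """u_hat: prediction of each low-level capsule i for each high-level
    capsule j, shape (n_low, n_high, dim).
    Returns the high-level capsule vectors, shape (n_high, dim)."""
    n_low, n_high, _ = u_hat.shape
    b = np.zeros((n_low, n_high))  # routing logits, start uniform
    for _ in range(n_iters):
        c = np.exp(b) / np.exp(b).sum(axis=1, keepdims=True)  # softmax over j
        s = np.einsum('ij,ijd->jd', c, u_hat)   # weighted sum of predictions
        v = squash(s)                           # squashed high-level capsules
        b += np.einsum('ijd,jd->ij', u_hat, v)  # reward agreeing predictions
    return v

rng = np.random.default_rng(0)
u_hat = rng.normal(size=(8, 4, 16))  # 8 low-level capsules, 4 high-level, dim 16
v = dynamic_routing(u_hat)
print(v.shape)  # (4, 16)
```

The clustering effect described in the statement comes from the agreement update: predictions that align with a high-level capsule's current output receive larger routing weights in the next iteration, so overlapped features get pulled apart into separate capsules.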
“…To reduce the impact of noisy words, tree-based methods attempt to obtain the relevant sub-structure of an instance for relation extraction (Xu et al., 2015; Miwa and Bansal, 2016; Liu et al., 2018). To discriminate overlapped relation features, Zhang et al. (2019) apply the capsule network (Sabour et al., 2017) to multi-labeled relation extraction. Inspired by the ability of multi-head attention to model long-term dependencies (Vaswani et al., 2017), Zhang et al. (2020) attempt to reduce multi-granularity noise via multi-head attention in relation extraction.…”
Section: Related Work (mentioning)
confidence: 99%
“…As shown in Figure 1 (Zeng et al., 2014) or self-attention (Lin et al., 2016). Although Zhang et al. (2019) first propose an attentive capsule network for multi-labeled relation extraction, it treats the CNN/RNN outputs as low-level capsules without encouraging diversity, which makes it difficult to distinguish different and overlapped relation features within a single type of semantic capsule. Therefore, a well-behaved relation extractor is needed to discriminate diverse overlapped relation features from different semantic spaces.…”
Section: Introduction (mentioning)
confidence: 99%
“…They treated aggregation as a routing problem so that complex and interleaved features could be identified by the capsule network. Zhang et al. [28] proposed capsule networks with dynamic routing for multi-labelled relation extraction to address the challenge of a single sentence expressing multiple relations. To better cluster features, they devised a dynamic routing algorithm based on an attention mechanism.…”
Section: Capsule Network for Relation Extraction (mentioning)
confidence: 99%
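The attention-based routing idea in the last statement can be illustrated with a small sketch: before routing clusters the low-level capsules, each capsule is re-weighted by its relevance to a query vector. The query, the dot-product scoring, and all shapes here are illustrative assumptions in the spirit of the cited work, not its exact formulation.

```python
# Hedged sketch of attention-weighted capsule aggregation: low-level
# capsules are scaled by a softmax attention score against a query
# vector (hypothetically, an entity-pair representation) so that
# relevant capsules dominate the subsequent routing step.
# The scoring form and shapes are illustrative assumptions.
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attentive_aggregate(low_caps, query):
    """low_caps: (n_low, dim) low-level capsule vectors.
    query: (dim,) vector summarizing what is relevant to the relation.
    Returns attention-scaled capsules of the same shape."""
    scores = softmax(low_caps @ query)   # relevance weight per capsule
    return low_caps * scores[:, None]    # amplify relevant, damp noisy ones

rng = np.random.default_rng(1)
caps = rng.normal(size=(8, 16))  # 8 low-level capsules of dimension 16
q = rng.normal(size=16)          # hypothetical entity-pair query
scaled = attentive_aggregate(caps, q)
print(scaled.shape)  # (8, 16)
```

A routing step like the one sketched earlier would then operate on `scaled` rather than `caps`, so the clustering is biased toward capsules the attention mechanism deems relevant.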