Proceedings of the Web Conference 2020
DOI: 10.1145/3366423.3380188

Generalizing Tensor Decomposition for N-ary Relational Knowledge Bases

Abstract: With the rapid development of knowledge bases (KBs), the link prediction task, which completes KBs with missing facts, has been broadly studied, especially in binary relational KBs (a.k.a. knowledge graphs) with powerful tensor-decomposition-related methods. However, the ubiquitous n-ary relational KBs with higher-arity relational facts have received less attention; existing translation-based and neural-network-based approaches have weak expressiveness and high complexity in modeling the various relations. Tensor dec…
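
As a rough illustration of the tensor-decomposition view of link prediction that the abstract refers to, the sketch below scores a binary fact (h, r, t) with a simple CP/DistMult-style factorization. It is a minimal, self-contained toy, not the paper's GETD model; the entity count, relation count, and rank are arbitrary assumed values.

```python
import numpy as np

# Toy dimensions (assumed for illustration only).
n_entities, n_relations, rank = 100, 20, 16

rng = np.random.default_rng(0)
E = rng.normal(size=(n_entities, rank))   # entity embeddings
R = rng.normal(size=(n_relations, rank))  # relation embeddings

def cp_score(h, r, t):
    """CP/DistMult-style score of a binary fact (h, r, t):
    the (h, r, t) entry of the low-rank reconstructed tensor."""
    return float(np.sum(E[h] * R[r] * E[t]))

# Rank all tail entities for a query (h, r, ?); higher score = more plausible.
h, r = 3, 5
scores = (E[h] * R[r]) @ E.T
print(scores.argsort()[::-1][:5])  # top-5 candidate tails
```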

Citations: cited by 59 publications (66 citation statements).
References: 31 publications.
“…To preserve the full expressivity of the KG, a set of new KGE methods has been developed to operate directly on hypergraphs and hyper-relational graphs. Some of the methods that deal with hypergraphs are HEBE [52], HGE [157], Hyper2vec [59], HNN [36], HCN [154], DHNE [134], HHNE [9], Hyper-SAGNN [164], HypE [35], and methods that embed hyper-relational graphs are, for instance, m-TransH [151], HSimple [35], RAE [163], GETD [84], TuckER [7], NaLP [51], HINGE [116], StarE [40].…”
Section: KGE-Methods - Input Type (mentioning)
confidence: 99%
“…Recently, GETD (Liu et al., 2020) extended the TuckER (Balazevic et al., 2019) tensor-factorization approach to n-ary relational facts. However, the model still expects only one relation per fact and cannot process facts of different arity in one dataset; e.g., 3-ary and 4-ary facts have to be split and trained separately.…”
Section: Related Work (mentioning)
confidence: 99%
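
To make the TuckER-to-n-ary generalization described in the statement above concrete, here is a minimal sketch of Tucker-style scoring in which an (n+1)-th order core tensor is contracted with one relation embedding and n entity embeddings. This is an illustrative toy, not the actual GETD implementation; the dimensions are arbitrary assumptions. It does show why the order of the core, and hence the model, is tied to a single arity, which is the limitation the citing paper points out.

```python
import numpy as np

rng = np.random.default_rng(0)
d_e, d_r = 8, 6                    # embedding sizes (assumed)
n_entities, n_relations = 50, 10   # toy vocabulary sizes (assumed)

E = rng.normal(size=(n_entities, d_e))   # entity embeddings
R = rng.normal(size=(n_relations, d_r))  # relation embeddings

def tucker_style_score(core, r, entities):
    """Score an n-ary fact r(e_1, ..., e_n) by contracting an
    (n+1)-th order core tensor with the relation embedding and
    then with each entity embedding in turn (TuckER when n = 2)."""
    x = np.tensordot(core, R[r], axes=([0], [0]))   # contract relation mode
    for e in entities:
        x = np.tensordot(x, E[e], axes=([0], [0]))  # contract one entity mode
    return float(x)

# Binary (TuckER-like) core vs. ternary core: the core's order matches the
# arity, so mixed-arity facts need separate cores / separately trained models.
core3 = rng.normal(size=(d_r, d_e, d_e))
core4 = rng.normal(size=(d_r, d_e, d_e, d_e))
print(tucker_style_score(core3, 1, [2, 3]))
print(tucker_style_score(core4, 1, [2, 3, 4]))
```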
“…Recent approaches (Guan et al., 2019; Liu et al., 2020; Rosso et al., 2020) for embedding hyper-relational KGs often use WikiPeople and JF17K as benchmarking datasets. We advocate that those datasets cannot fully capture the task complexity.…”
Section: WD50K Dataset (mentioning)
confidence: 99%
“…Each hyper-relational fact has a base triple (h, r, t) and additional key-value (relation-entity) pairs (k, v) (e.g., {(academic major, physics), (academic degree, Master of Science)}). A line of work converts hyper-relational facts to n-ary meta-relations r(e_1, ..., e_n) and leverages translational-distance embedding [66, 70], spatio-translational embedding [1], or tensor factorization [44] for modeling. Other approaches directly learn hyper-relational facts in their original form using various techniques, including convolutional neural networks, graph neural networks, and transformer models [53, 24].…”
Section: B Modeling N-ary Facts (mentioning)
confidence: 99%
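
The two representations contrasted in the statement above can be made concrete with a small data-structure sketch: a hyper-relational fact stored as a base triple plus qualifier pairs, and a helper that flattens it into an n-ary meta-relation. The class, field names, example values, and flattening rule are illustrative assumptions, not any cited model's actual scheme.

```python
from dataclasses import dataclass, field

@dataclass
class HyperRelationalFact:
    """A base triple (h, r, t) plus key-value qualifier pairs."""
    head: str
    relation: str
    tail: str
    qualifiers: dict[str, str] = field(default_factory=dict)

    def to_nary(self) -> tuple[str, tuple[str, ...]]:
        """Flatten into an n-ary meta-relation r(e_1, ..., e_n):
        qualifier keys are folded into a composite relation name,
        losing the triple/qualifier distinction."""
        keys = sorted(self.qualifiers)
        meta_relation = "|".join([self.relation, *keys])
        return meta_relation, (self.head, self.tail, *(self.qualifiers[k] for k in keys))

fact = HyperRelationalFact(
    "Marie Curie", "educated_at", "University of Paris",
    {"academic_major": "physics", "academic_degree": "Master of Science"},
)
print(fact.to_nary())
```

Under this flattening rule, each distinct combination of qualifier keys produces a new composite meta-relation, which illustrates why the two representations are not equivalent in practice.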