Companion Proceedings of the Web Conference 2022
DOI: 10.1145/3487553.3524238

From Discrimination to Generation: Knowledge Graph Completion with Generative Transformer

Cited by 41 publications (11 citation statements)
References 8 publications

“…These models learn the structural information of a knowledge graph while disregarding the textual information of the entities and relations. Recently, Yao et al. (2019); Wang et al. (2021a); Xie et al. (2022b); Saxena et al. (2022) proposed to encode entity and relation textual knowledge into the model using PLMs. Instead of calculating scores from embeddings, they train PLMs to produce plausibility scores for KG text representations.…”
Section: Related Work
confidence: 99%
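To make this PLM-based scoring concrete, here is a minimal KG-BERT-style sketch: a triple is verbalized as text and a sequence classifier emits a plausibility logit. The model name `bert-base-uncased` and the verbalization format are illustrative assumptions, not details taken from the cited papers.

```python
# A minimal KG-BERT-style plausibility scorer: verbalize a triple as text
# and let a pre-trained encoder classify how plausible it is.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=1  # single logit = plausibility score
)

def triple_score(head_text: str, relation_text: str, tail_text: str) -> float:
    # Pack the three text segments into one sequence; [SEP] markers let the
    # encoder distinguish the head / relation / tail spans.
    text = f"{head_text} [SEP] {relation_text} [SEP] {tail_text}"
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logit = model(**inputs).logits.squeeze()
    return torch.sigmoid(logit).item()  # map the logit into (0, 1)

# Untrained weights give an arbitrary score; in practice the classifier head
# is fine-tuned on positive triples vs. corrupted negatives.
print(triple_score("Barack Obama", "born in", "Honolulu"))
```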
“…Firstly, to remedy the KG structure information loss caused by the naïve "text-to-text" format, we improve KG-S2S via 1) input representations of entities and relations using Entity Description, Soft Prompt and Seq2Seq Dropout; and 2) a constrained inference algorithm empowered by Prefix Constraints. Secondly, we treat all the KG elements (i.e., entity, relation and timestamp) as "flat" text (Figure 1), which enables KG-S2S to i) handle various verbalizable knowledge graph structures and ii) generate non-entity text and find novel entities for KGs. We make several improvements over the preliminary attempts of the concurrent works (Saxena et al., 2022; Xie et al., 2022b) using Seq2Seq for KGC. Our model adds special treatments to the input entity/relation textual representations.…”
Section: Introduction
confidence: 99%
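A minimal sketch of what this "flat text" formulation looks like: every KG element, including an optional timestamp, is verbalized, so link prediction becomes plain sequence-to-sequence text generation. The delimiters and prompt wording below are illustrative placeholders, not the exact format used by KG-S2S or KGT5.

```python
# Verbalize a (head, relation, ?) query as a source string and the tail
# entity's name as the target string for a seq2seq model.
from typing import Optional

def verbalize_query(head: str, relation: str,
                    timestamp: Optional[str] = None) -> str:
    # Source side: the incomplete triple (h, r, ?) as one flat string.
    parts = [f"predict tail: {head}", f"| {relation}"]
    if timestamp is not None:  # temporal KGs just add one more text segment
        parts.append(f"| {timestamp}")
    return " ".join(parts)

def verbalize_target(tail: str) -> str:
    # Target side: the tail entity's surface form, so the decoder can also
    # emit names of entities never seen during training.
    return tail

src = verbalize_query("Barack Obama", "place of birth")
tgt = verbalize_target("Honolulu")
print(src, "->", tgt)
# predict tail: Barack Obama | place of birth -> Honolulu
```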
“…To guarantee consistency between the decoded sequence and the schemas and tokens in the KG, GenKGC [14] proposes an entity-aware hierarchical decoder to constrain X_t. In addition, inspired by prompt learning, GenKGC takes triples with the same relation as demonstrations to implicitly encode structured knowledge.…”
Section: Text-based KG Representation
confidence: 99%
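One common way to realize this kind of decoding constraint is a prefix trie over the token IDs of all entity names: at each step the decoder may only emit tokens that extend some valid entity. The sketch below is that generic trie-based constraint with a toy vocabulary, not GenKGC's actual entity-aware hierarchical decoder.

```python
# Prefix trie over entity token-ID sequences; allowed_next() returns the
# only tokens a constrained decoder may emit given what it generated so far.
from typing import Dict, List

class EntityTrie:
    END = -1  # marker key: a complete entity name ends at this node

    def __init__(self, entity_token_ids: List[List[int]]):
        self.root: Dict[int, dict] = {}
        for ids in entity_token_ids:
            node = self.root
            for tok in ids:
                node = node.setdefault(tok, {})
            node[self.END] = {}  # terminal marker

    def allowed_next(self, prefix: List[int]) -> List[int]:
        # Walk the trie along the tokens generated so far; the children of
        # the reached node are the only legal continuations.
        node = self.root
        for tok in prefix:
            if tok not in node:
                return []  # prefix is not part of any entity name
            node = node[tok]
        return [t for t in node if t != self.END]

# Toy vocabulary: pretend each word is a single token ID.
vocab = {"new": 0, "york": 1, "city": 2, "zealand": 3}
trie = EntityTrie([[0, 1], [0, 1, 2], [0, 3]])  # "new york (city)", "new zealand"
print(trie.allowed_next([0]))  # [1, 3]: only "york" or "zealand" may follow "new"
```

In practice such a constraint can be plugged into beam search, for example via the `prefix_allowed_tokens_fn` hook of Hugging Face's `generate()`.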
“…To avoid its high time complexity, StAR [12] and SimKGC [13] both introduce a tower-based method that precomputes entity embeddings and retrieves top-k entities efficiently. Further, GenKGC [14] and KGT5 [9] treat knowledge graph completion as sequence-to-sequence generation. Besides, kNN-KGE [18] is a KG representation model that linearly interpolates its entity distribution with k-nearest neighbors.…”
Section: Model Hub
confidence: 99%
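To illustrate why the tower-based setup is efficient: entity embeddings are computed once offline, so answering a query reduces to a single matrix product plus a top-k selection, instead of one cross-encoder forward pass per candidate. A minimal NumPy sketch, with random vectors standing in for the actual query and entity encoders:

```python
# Tower-based retrieval in the StAR/SimKGC spirit: score a query vector
# against all precomputed entity embeddings at once, then take the top k.
import numpy as np

rng = np.random.default_rng(0)
num_entities, dim = 10_000, 128

# Offline: one embedding per entity, L2-normalized for cosine scoring.
entity_matrix = rng.normal(size=(num_entities, dim)).astype(np.float32)
entity_matrix /= np.linalg.norm(entity_matrix, axis=1, keepdims=True)

def top_k_entities(query_vec: np.ndarray, k: int = 10) -> np.ndarray:
    # Online: one matrix-vector product scores every entity ...
    scores = entity_matrix @ (query_vec / np.linalg.norm(query_vec))
    # ... and argpartition finds the k best without a full sort.
    idx = np.argpartition(-scores, k)[:k]
    return idx[np.argsort(-scores[idx])]  # re-sort just the k winners

query = rng.normal(size=dim).astype(np.float32)  # stands in for encode(h, r)
print(top_k_entities(query, k=5))
```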