2019
DOI: 10.1007/978-3-030-32233-5_18
KG-to-Text Generation with Slot-Attention and Link-Attention

Abstract: Slot attention has shown remarkable object-centric representation learning performance in computer vision tasks without requiring any supervision. Despite its object-centric binding ability brought by compositional modelling, as a deterministic module, slot attention lacks the ability to generate novel scenes. In this paper, we propose the Slot-VAE, a generative model that integrates slot attention with the hierarchical VAE framework for object-centric structured scene generation. For each image, the model simu…

Cited by 7 publications (6 citation statements). References 65 publications.
“…Papers Human-centric (HC) [50], [53], [100] Machine-centric (MC) [60], [63]- [67], [83], [84], [86] [3], [69]- [71], [74], [95], [99], [122] [72], [73], [75]- [80] [21], [82], [90]- [92], [94], [102], [125] [6], [48], [49], [51], [52], [55], [56], [126] [18], [57], [58], [115], [119], [127], [128] [104], [112]- [114], [120], [129] [105], [106], [108], [110], [116], [117], [123] Both [5], [54], [59],…”
Section: Metrics Group (mentioning; confidence: 99%)
“…Currently, researchers are studying how to integrate neural networks with knowledge bases. Some research works have focused on table-to-text [8][9][10][30], where a table depicting a "plain" knowledge base is read by a sequence-to-sequence model and transformed into a text that reports (almost) all of the information occurring in the table. Wang et al. [30] constructed both a latent graph over the entries of the table, based on their position in the input, and an attention mechanism over them.…”
Section: Related Work (mentioning; confidence: 99%)
“…To address this drawback, we examined recent works on neural network models based on knowledge bases (KBs) [8][9][10][11][12][13][14]. In these research works, the knowledge graph created from an external KB was explored by the neural network to extract the relevant information (i.e., it selected one or more vertices) via an attention mechanism, which focused on the relevant entities of the graph.…”
Section: Introduction (mentioning; confidence: 99%)
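The attention-over-graph-vertices mechanism this citation statement describes can be sketched roughly as follows. This is an illustrative pure-Python toy, not the cited models' actual architecture: the dot-product scorer, the toy entity embeddings, and the function name `attend_over_kg` are all assumptions, while the cited works use learned, model-specific scoring functions.

```python
import math

def attend_over_kg(query, node_embeddings):
    """Soft-select relevant KG vertices for a decoder query.

    Sketch only: dot-product scores per vertex, a softmax over the
    vertices, then a weighted sum of the node embeddings as context.
    """
    # Score each vertex against the query (plain dot product here).
    scores = [sum(q * x for q, x in zip(query, node))
              for node in node_embeddings]
    m = max(scores)                       # stabilise the softmax
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    weights = [e / z for e in exps]       # attention weight per vertex
    # Context vector: convex combination of the node embeddings.
    dim = len(query)
    context = [sum(w * node[d] for w, node in zip(weights, node_embeddings))
               for d in range(dim)]
    return context, weights

# Hypothetical toy graph: three entity embeddings of dimension 4.
nodes = [[1.0, 0.0, 0.0, 0.0],
         [0.0, 1.0, 0.0, 0.0],
         [0.0, 0.0, 1.0, 0.0]]
query = [2.0, 0.1, 0.1, 0.0]
context, weights = attend_over_kg(query, nodes)
# The first vertex aligns best with the query, so it receives the
# largest attention weight and dominates the returned context vector.
```

The "selects one or more vertexes" behaviour in the quote corresponds to the softmax concentrating most of its mass on a few high-scoring vertices.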
“…The process of converting data represented as knowledge graphs into text is sometimes referred to as graph-to-text (Schmitt et al., 2020) or KG-to-text (Schmitt et al., 2021; Wang et al., 2019). The term data-to-text typically refers to a more general group of tasks of which KG-to-text is a part (Nan et al., 2021; Yin and Wan, 2022; Ji et al., 2023).…”
Section: KG-to-Text Synthesis (mentioning; confidence: 99%)