2020 IEEE 32nd International Conference on Tools with Artificial Intelligence (ICTAI)
DOI: 10.1109/ictai50040.2020.00154

Graph Attention Auto-Encoders

Cited by 75 publications (49 citation statements)
References 24 publications
“…To better capture the relevance between nodes and neighbors, we employ an attention mechanism [32]. To be precise, the relevance of a neighboring node j to node i in the s-th graph can be defined as:…”
Section: B Self-reconstructionmentioning
confidence: 99%
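The attention-based relevance described in the quoted passage can be sketched as a GAT-style coefficient: a learnable score between node i and each neighbor j, normalized over the neighborhood. The sketch below is a minimal NumPy illustration, not the cited paper's exact formulation; `a_src` and `a_dst` are hypothetical names for the learnable attention vectors.

```python
import numpy as np

def attention_relevance(h, adj, a_src, a_dst):
    """GAT-style relevance of neighbor j to node i.

    h:     (N, F) node feature matrix
    adj:   (N, N) binary adjacency (assumed to include self-loops,
           so every row has at least one neighbor)
    a_src, a_dst: (F,) learnable attention vectors (illustrative names)
    Returns (N, N) coefficients; row i sums to 1 over i's neighbors.
    """
    # Raw scores e_ij = LeakyReLU(a_src . h_i + a_dst . h_j), via broadcasting
    scores = (h @ a_src)[:, None] + (h @ a_dst)[None, :]   # (N, N)
    scores = np.where(scores > 0, scores, 0.2 * scores)    # LeakyReLU
    # Mask out non-neighbors, then softmax within each neighborhood
    scores = np.where(adj > 0, scores, -np.inf)
    scores -= scores.max(axis=1, keepdims=True)            # numerical stability
    exp = np.exp(scores)                                   # exp(-inf) -> 0
    return exp / exp.sum(axis=1, keepdims=True)
```

Masking with `-inf` before the softmax guarantees non-neighbors receive exactly zero weight, which matches the intent of restricting relevance to the neighborhood.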
“…Then STGATE learns low-dimensional latent embeddings with both spatial information and gene expressions via a graph attention auto-encoder [14] (Fig. 1).…”
Section: Overview Of Stgatementioning
confidence: 99%
“…In this paper, we regard cold-start as a missing-data problem in which user-item interaction data are missing. Inspired by denoising auto-encoders [26], which are trained to reconstruct the input from a corrupted version, as well as the classical graph auto-encoder framework [4,20,31], we propose a novel model named Multi-view Denoising Graph Auto-Encoders (MvDGAE) on HINs to alleviate the cold-start problem. We first extract multifaceted meaningful semantics, reflected by meta-paths on HINs, as multiple views for both users and items, effectively enhancing user/item relationships from different aspects.…”
Section: Introductionmentioning
confidence: 99%
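The denoising scheme this passage builds on, corrupting the input and training the model to reconstruct the clean version, can be sketched minimally as follows. Function names and the dropout-style corruption are illustrative assumptions, not MvDGAE's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def corrupt(x, drop_rate=0.3):
    """Dropout-style corruption: randomly zero out entries of x.

    A denoising auto-encoder is fed corrupt(x) but penalized for
    failing to reconstruct the original, uncorrupted x.
    """
    mask = rng.random(x.shape) >= drop_rate
    return x * mask

def reconstruction_loss(x, x_hat):
    """Mean squared error between the clean input and the reconstruction."""
    return float(np.mean((x - x_hat) ** 2))
```

In the cold-start setting described above, zeroing entries mimics missing user-item interactions, so minimizing `reconstruction_loss(x, decoder(encoder(corrupt(x))))` pushes the model to infer interactions it was never shown.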