Proceedings of the Web Conference 2021 2021
DOI: 10.1145/3442381.3449914
Heterogeneous Graph Neural Network via Attribute Completion

Cited by 129 publications (33 citation statements) · References 20 publications
“…GTN [54] generates new graph structures by identifying useful connections between unconnected nodes in the original graph, and learns effective node embeddings on these new graphs in an end-to-end fashion. HGNN-AC [55], building on reference 53, proposed a general framework for heterogeneous graph neural networks via attribute completion, comprising the pre-learning of topological embeddings and attention-based attribute completion. These heterogeneous graph neural network representation methods enhance the representation ability of nodes and provide a more practical foundation for downstream tasks.…”
Section: Heterogeneous Graph Neural Network
confidence: 99%
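The attention-based completion step described in the statement above can be sketched in a few lines. This is an illustrative assumption of the general idea, not HGNN-AC's exact implementation: the function name, the dense softmax over all attributed nodes, and the dot-product scoring are simplifications (the paper's mechanism operates within the heterogeneous graph's neighborhoods).

```python
import numpy as np

def complete_attributes(topo_emb, attrs, has_attr):
    """Hypothetical sketch of attention-based attribute completion:
    each node lacking attributes receives a weighted sum of the
    attributed nodes' features, with attention weights computed from
    similarity in the pre-learned topological-embedding space."""
    completed = attrs.copy()
    src = np.where(has_attr)[0]                 # nodes that keep their own attributes
    for v in np.where(~has_attr)[0]:
        scores = topo_emb[src] @ topo_emb[v]    # similarity in topology space
        weights = np.exp(scores - scores.max())
        weights /= weights.sum()                # softmax attention
        completed[v] = weights @ attrs[src]     # attention-weighted completion
    return completed
```

Because the weights form a convex combination, each completed attribute lies within the range spanned by the attributed nodes' features.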
“…use node attributes to construct a K-Nearest-Neighbor (KNN) graph, thereby enhancing the model's learning ability without modifying the original graph topology during graph learning. Meanwhile, some studies jointly optimize multiple closely related tasks to estimate the information missing from any single task Jin et al [2021a, 2020], while others use pre-training Lu et al [2021a] to accelerate the learning process. In addition, self-supervised methods can serve as another competitive macro-level augmentation strategy.…”
Section: Macro-level Augmentation Strategies
confidence: 99%
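The KNN-graph augmentation mentioned above can be illustrated concretely. The cosine-similarity choice and function name here are assumptions for the sketch rather than any specific paper's method; the key point is that the constructed adjacency is separate from, and leaves untouched, the original topology.

```python
import numpy as np

def knn_graph(features, k):
    """Illustrative sketch: build a K-Nearest-Neighbor adjacency matrix
    from node attributes via cosine similarity (assumes nonzero rows)."""
    norm = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = norm @ norm.T
    np.fill_diagonal(sim, -np.inf)          # exclude self-loops
    adj = np.zeros_like(sim)
    for i in range(sim.shape[0]):
        nbrs = np.argsort(sim[i])[-k:]      # indices of the k most similar nodes
        adj[i, nbrs] = 1.0
    return adj
```

A model can then aggregate over both this attribute-derived graph and the original structural graph.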
“…Experimental results indicate the effectiveness of the proposed method on link prediction and attribute completion tasks. HGNN-AC Jin et al [2021a] uses a graph attention mechanism to complete the missing attributes of nodes in heterogeneous graphs, avoiding the hand-crafted approaches previously used to solve this problem.…”
Section: GAL For Low-quality Data
confidence: 99%
“…Graph Convolutional Network (GCN) (Kipf and Welling 2017), known for its effectiveness in graph representation learning, follows the principal learning mechanism whereby adjacent nodes attain similar representations by aggregating their neighbors' information. These representations support downstream tasks such as node classification (Duvenaud et al 2015; Jin et al 2021a; He et al 2021; Hu et al 2019) and link prediction (Schlichtkrull et al 2018; Zhang and Chen 2018; Kipf and Welling 2016; Cao et al 2021).…”
Section: Introduction
confidence: 99%
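The GCN propagation rule from Kipf and Welling (2017) cited above can be written compactly. This is a minimal NumPy sketch of a single layer, H' = ReLU(D̂^(-1/2) Â D̂^(-1/2) H W) with Â = A + I, not a full trainable implementation:

```python
import numpy as np

def gcn_layer(adj, features, weight):
    """One GCN layer: symmetrically normalized aggregation with
    self-loops, followed by a linear transform and ReLU."""
    a_hat = adj + np.eye(adj.shape[0])              # add self-loops: A + I
    d_inv_sqrt = 1.0 / np.sqrt(a_hat.sum(axis=1))   # D^(-1/2) from row degrees
    norm_adj = a_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return np.maximum(norm_adj @ features @ weight, 0.0)  # ReLU activation
```

Because each row of the normalized adjacency mixes a node's own features with its neighbors', adjacent nodes are pulled toward similar representations, which is the aggregation principle the passage describes.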