2018
DOI: 10.48550/arxiv.1812.00086
Preprint

Graph Node-Feature Convolution for Representation Learning

Abstract: Graph convolutional network (GCN) is an emerging neural network approach. It learns a new representation of a node by aggregating the feature vectors of all its neighbors, without considering whether those neighbors or features are actually useful. Recent methods have improved on this by sampling a fixed-size set of neighbors or by assigning different weights to different neighbors during aggregation, but the features within a feature vector are still treated equally in the aggregation process.…
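
For context, the neighbor aggregation the abstract criticizes can be written as the standard GCN propagation rule; this is a minimal sketch in common notation (the symbols H, A, W and the normalization below are assumptions for illustration, not taken from this preprint):

\[
  H^{(l+1)} = \sigma\!\left( \hat{D}^{-1/2} \hat{A}\, \hat{D}^{-1/2} H^{(l)} W^{(l)} \right),
  \qquad \hat{A} = A + I,\quad \hat{D}_{ii} = \textstyle\sum_j \hat{A}_{ij}
\]

In this form every neighbor contributes its entire feature vector with a weight fixed by the graph structure alone, so neither individual neighbors nor individual feature dimensions can be selectively emphasized during aggregation, which is the limitation the preprint targets.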

Cited by 5 publications (7 citation statements)
References 16 publications
“…To meet this requirement, we model the object layout as an undirected graph. We represent this graph in terms of Feature Matrix 𝑋 and Adjacency Matrix 𝐴 [20,45]. Let p be the maximum possible number of parts across all object categories, i.e.…”
Section: BoxGCN-VAE
Mentioning confidence: 99%
“…citation networks [38], WebKB graphs [1]) that have thousands of nodes and edges. Moreover, it was observed in [28,53] that the common instantiation of the attention mechanism on GNNs (i.e. GATs) does not necessarily bring performance boost over standard GCNs on distinct graph datasets.…”
Section: Dataset
Mentioning confidence: 99%
“…They were introduced in [21], which consist of an iterative process aggregating and transforming representation vectors of its neighboring nodes to capture structural information. Recently, several variants have been proposed, which employ self-attention mechanism [40] or improve network architectures [43,48] to boost the performance. However, most of them are based on empirical intuition and heuristics.…”
Section: Related Work
Mentioning confidence: 99%
“…Therefore, learning to understand graphs is a crucial problem in machine learning. Previous studies in the literature generally fall into two main categories: (1) graph classification [8,21,42,49,50], where the whole structure of graphs is captured for similarity comparison; (2) node classification [1,21,40,43,48], where the structural identity of nodes is determined for representation learning.…”
Section: Introduction
Mentioning confidence: 99%