2009
DOI: 10.1109/tnn.2008.2005141
Computational Capabilities of Graph Neural Networks

Abstract: In this paper, we will consider the approximation properties of a recently introduced neural network model called graph neural network (GNN), which can be used to process structured data inputs, e.g., acyclic graphs, cyclic graphs, and directed or undirected graphs. This class of neural networks implements a function τ(G, n) ∈ ℝ^m that maps a graph G and one of its nodes n onto an m-dimensional Euclidean space. We characterize the functions that can be approximated by GNNs, in probability, up…
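The function τ(G, n) ∈ ℝ^m described in the abstract can be sketched as a minimal iterative model in the Scarselli style: node states are updated from neighbour states until they (approximately) converge, then a readout maps the state of node n to an m-dimensional vector. This is an illustrative sketch, not the authors' implementation; the function name `tau`, the random linear weights, and the fixed iteration count are all assumptions made for the example.

```python
import numpy as np

def tau(adj, node_labels, n, m=2, d=4, iters=50, seed=0):
    """Illustrative GNN-style map tau(G, n) -> R^m.

    adj: dict mapping each vertex to a list of its neighbours
    node_labels: (V, L) array of node label vectors
    n: the node whose output vector is requested
    """
    rng = np.random.default_rng(seed)
    V = len(adj)
    W = rng.normal(size=(d, d)) * 0.1   # small weights, so the update behaves contractively
    B = rng.normal(size=(d, node_labels.shape[1]))
    C = rng.normal(size=(m, d))         # readout into R^m
    x = np.zeros((V, d))                # one d-dimensional state per node
    for _ in range(iters):
        # state of v depends on its own label and the states of its neighbours
        nbr = np.stack([sum((x[u] for u in adj[v]), start=np.zeros(d))
                        for v in range(V)])
        x = np.tanh(node_labels @ B.T + nbr @ W.T)
    return C @ x[n]                     # m-dimensional output for node n
```

With random weights the output is of course meaningless; the point is only the shape of the computation: iterate a state-update map, then read out the state of a single node.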


Cited by 194 publications (119 citation statements)
References 29 publications
“…We now restate the definition of an unfolding tree, first given by Scarselli et al. [3], which leads to a natural equivalence relation between vertices of a given graph. Informally, the unfolding tree of a vertex v in a labelled graph G can be thought of as a decision tree of walks starting at v in G. We start with a vertex corresponding to the length-0 walk ⟨v⟩, and then connect all of the length-1 walks starting at v to the vertex ⟨v⟩.…”
Section: Chapter (mentioning)
confidence: 99%
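The construction quoted above, building a tree of walks outward from a vertex, can be sketched directly. This is a minimal illustration of the idea, not code from the cited work; the names `Tree` and `unfolding_tree` and the explicit depth cutoff are assumptions for the example (the cutoff is needed because on a cyclic graph the full unfolding tree is infinite).

```python
from dataclasses import dataclass, field

@dataclass
class Tree:
    label: object                       # label of the graph vertex this node copies
    children: list = field(default_factory=list)

def unfolding_tree(adj, labels, v, depth):
    """Unfolding tree of vertex v, truncated at the given walk length.

    adj: dict mapping each vertex to a list of its neighbours
    labels: dict mapping each vertex to its label
    """
    node = Tree(labels[v])              # root corresponds to the length-0 walk <v>
    if depth > 0:
        # each length-1 continuation of the current walk becomes a child
        node.children = [unfolding_tree(adj, labels, u, depth - 1)
                         for u in adj[v]]
    return node

# Example: triangle graph, unfolded from vertex 0 to walks of length 2
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
labels = {0: "a", 1: "b", 2: "c"}
t = unfolding_tree(adj, labels, 0, 2)
```

Two vertices are then equivalent, in the sense used by the excerpt, when their unfolding trees are isomorphic as labelled trees.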
“…Unfolding trees were first presented by Scarselli et al. in a paper on their Graph Neural Network (GNN) model [3]. The GNN model is a neural net architecture designed to predictively model properties of graph-structured data, such as molecular structures or a set of interlinked web pages.…”
(mentioning)
confidence: 99%
“…Examples include structured data such as bioinformatics sequences, graphs, or tree structures as they occur in linguistics, time series data, functional data arising in mass spectrometry, relational data stored in relational databases, etc. In consequence, a variety of techniques has been developed to extend powerful statistical machine learning tools towards non-vectorial data such as kernel methods using structure kernels, recursive and graph networks, functional methods, relational approaches, and similar [9,12,5,27,6,26,10,11]. One very prominent way to extend statistical machine learning tools is offered by the choice of problemspecific measures of data proximity, which can often directly be used in machine learning tools based on similarities, dissimilarities, distances, or kernels.…”
Section: Introduction (mentioning)
confidence: 99%
“…GNNs have been proposed to process different types of graph-theoretic problems such as subgraph matching [5], the clique problem [6], the Half Hot problem [6], the Mutagenesis problem [5], and the tree depth problem [2]. Pucci et al. [7] have tested the GNN model on the MovieLens data set, and pointed out some limitations of Graph Neural Networks when applied to recommender systems.…”
Section: Introduction (mentioning)
confidence: 99%