2023
DOI: 10.48550/arxiv.2301.08210
Preprint
Everything is Connected: Graph Neural Networks

Abstract: In this survey, I will present a vibrant and exciting area of deep learning research: graph representation learning. Or, put simply, building machine learning models over data that lives on graphs (interconnected structures of nodes connected by edges). These models are commonly known as graph neural networks, or GNNs for short. There is very good reason to study data on graphs. From the molecule (a graph of atoms connected by chemical bonds) all the way to the connectomic structure of the brain (a graph of neu…

Cited by 1 publication (1 citation statement)
References 20 publications
“…On the other hand, a Transformer can be perceived as a fully-connected (all vertices are connected to all vertices) Graph Neural Network with trainable edge weights given by a self-attention [37]. From a practical perspective, the empirical success of the Transformer stems from its ability to learn highly complex and useful patterns.…”
Section: Background: Transformers · Citation type: mentioning · Confidence: 99%
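The citing passage's view of a Transformer as a fully-connected GNN can be made concrete with a short sketch: self-attention computed as one round of message passing on a complete graph, where the softmaxed attention scores play the role of trainable, input-dependent edge weights. This is a minimal illustration, not code from the surveyed paper; all names, shapes, and the single-head, unmasked setup are assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention_as_gnn(X, Wq, Wk, Wv):
    """One message-passing round on a fully-connected graph.

    X is an (n, d) matrix of node (token) features. The attention
    matrix A acts as a dense, learned "adjacency": entry A[i, j] is
    the edge weight with which node i aggregates node j's message.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    A = softmax(Q @ K.T / np.sqrt(K.shape[-1]))  # (n, n) edge weights
    return A @ V  # every node aggregates messages from every node

# Illustrative usage with random weights (shapes are assumptions).
rng = np.random.default_rng(0)
n, d = 4, 8
X = rng.standard_normal((n, d))
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
H = self_attention_as_gnn(X, Wq, Wk, Wv)
print(H.shape)  # one updated feature vector per node
```

Because the graph is complete, every node attends to every other node in a single step, which is exactly the "all vertices connected to all vertices" reading in the quoted statement; a standard GNN would instead restrict `A` to a sparse, fixed adjacency.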