2021
DOI: 10.1613/jair.1.13225

Graph Kernels: A Survey

Abstract: Graph kernels have attracted a lot of attention during the last decade, and have evolved into a rapidly developing branch of learning on structured data. During the past 20 years, the considerable research activity that occurred in the field resulted in the development of dozens of graph kernels, each focusing on specific structural properties of graphs. Graph kernels have proven successful in a wide range of domains, ranging from social networks to bioinformatics. The goal of this survey is to provide a unify…

Cited by 61 publications (44 citation statements)
References 136 publications
“…A simple unlabeled graph is usually represented through an adjacency matrix A(G_X) ∈ R^{N×N}, whose elements A_{ij} are set to one if two different nodes i and j are linked by an edge or to zero otherwise [45,47]. However, the same graph would correspond to an infinite number of conformations through this definition: by changing the interatomic distances, as long as the edges are preserved (i.e.…”
Section: Methods (mentioning)
Confidence: 99%
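A minimal sketch of the adjacency-matrix representation described in the excerpt above, using an assumed toy graph rather than data from the cited work: A is an N×N matrix with A_{ij} = 1 when nodes i and j share an edge and 0 otherwise, so it records only connectivity and ignores interatomic distances.

```python
# Sketch of the adjacency-matrix representation of a simple unlabeled graph.
# The edge list below is an assumed toy example, not data from the cited paper.
import numpy as np

def adjacency_matrix(n_nodes, edges):
    """Return the symmetric N x N adjacency matrix of an undirected graph."""
    A = np.zeros((n_nodes, n_nodes))
    for i, j in edges:
        A[i, j] = 1.0   # A_ij = 1 iff nodes i and j are linked by an edge
        A[j, i] = 1.0   # undirected graph, so the matrix is symmetric
    return A

# A 4-node path graph: 0 - 1 - 2 - 3.  Any conformation that preserves these
# edges (whatever the interatomic distances) maps to the same matrix.
A = adjacency_matrix(4, [(0, 1), (1, 2), (2, 3)])
print(A)
```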
“…which is also commonly used in other graph applications [45-47,52]. This normalization can be seen as a Jaccard distance, i.e.…”
Section: 3 (mentioning)
Confidence: 99%
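The excerpt is cut off before the definition, but the standard Jaccard distance it alludes to is 1 − |A ∩ B| / |A ∪ B| for two sets. A minimal sketch over two assumed edge sets follows; exactly which quantities the citing paper normalizes is not recoverable from the snippet.

```python
# Sketch of the standard Jaccard distance between two sets; the edge sets
# below are assumed examples, not the normalization used in the citing paper.
def jaccard_distance(a: set, b: set) -> float:
    """1 - |A ∩ B| / |A ∪ B|; 0 for identical sets, 1 for disjoint ones."""
    union = a | b
    return 1.0 - len(a & b) / len(union) if union else 0.0

edges_1 = {(0, 1), (1, 2), (2, 3)}
edges_2 = {(0, 1), (1, 3), (2, 3)}
print(jaccard_distance(edges_1, edges_2))  # 0.5: half of the edges are shared
```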
“…1. graph representation learning [73], such as message passing neural networks (MPNNs) [74,75], which learn task-specific vector representations of molecular graphs for prediction tasks in an end-to-end manner; 2. graph kernels [76-81], which (loosely speaking) measure the similarity between any two input graphs, allowing kernel methods [82] such as support vector machines [83], kernel regression/classification [82], and Gaussian processes [84] to be used for prediction tasks.…”
Section: Representing Molecules for Supervised Machine Learning Tasks (mentioning)
Confidence: 99%
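Loosely following the second item in the excerpt above, a minimal sketch of how a graph kernel feeds a kernel method: a vertex-histogram kernel (one of the simplest graph kernels) is computed with NumPy and passed to scikit-learn's precomputed-kernel SVM. The toy graphs, label alphabet, and targets are illustrative assumptions, not data from any of the cited works.

```python
# Sketch: a vertex-histogram graph kernel plugged into a precomputed-kernel SVM.
# Graphs, targets, and the label alphabet below are illustrative assumptions.
import numpy as np
from sklearn.svm import SVC

def vertex_histogram(node_labels, alphabet):
    """Count how often each discrete node label occurs in one graph."""
    hist = np.zeros(len(alphabet))
    for lab in node_labels:
        hist[alphabet.index(lab)] += 1
    return hist

# Toy molecular-style graphs described only by their node (atom) labels.
graphs = [["C", "C", "O"], ["C", "O", "O", "N"], ["C", "C", "C"], ["N", "O"]]
y = np.array([1, 0, 1, 0])                     # hypothetical binary targets
alphabet = ["C", "N", "O"]

X = np.stack([vertex_histogram(g, alphabet) for g in graphs])
K = X @ X.T                                    # Gram matrix of pairwise graph similarities

clf = SVC(kernel="precomputed").fit(K, y)      # any kernel method can consume K
print(clf.predict(K))                          # predictions on the training graphs
```

The same precomputed Gram matrix could equally be handed to kernel ridge regression or a Gaussian process, which is the point the citing paper makes about graph kernels decoupling graph comparison from the downstream learner.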
“…This is unlike the modular approach of GraphDCA, which measures similarity in terms of different node representations. For a recent overview of graph kernels, see (Nikolentzos et al., 2021).…”
Section: Graph Comparison (mentioning)
Confidence: 99%