ESANN 2023 Proceedings
DOI: 10.14428/esann/2023.es2023-4
Graph Representation Learning

Davide Bacciu,
Federico Errica,
Alessio Micheli
et al.

Abstract: In a broad range of real-world machine learning applications, representing examples as graphs is crucial to avoid a loss of information. For this reason, in the last few years, the definition of machine learning methods, particularly neural networks, for graph-structured inputs has been gaining increasing attention. In particular, Deep Graph Networks (DGNs) are nowadays the most commonly adopted models to learn a representation that can be used to address different tasks related to nodes, edges, or even entire…

Cited by 2 publications (2 citation statements)
References 28 publications
[Table 3 columns: Paths, Stratified Paths, Cliques, Stratified Cliques]
“…In Table 3, we finally show a brief comparison against current approaches for graph classification. Competitors span a variety of techniques, including classifiers working on top of GEDs [49,67], kernel methods [68][69][70][71], and several embedding techniques [68,72,73], including Granular Computing-based ones [32,74] and those based on neural networks and deep learning [75][76][77][78][79]. We can see that our method has comparable performance against current approaches in the graph classification literature.…”
Section: Computational Results
confidence: 80%
“…Research on Spectral GCNs shows that graph convolution in essence implements a neighborhood aggregation scheme, where node features are updated through neighborhood aggregation in a manner similar to image convolution [6]. This led to the proposal of spatial graph neural networks [7]-[11], which have a simpler structure and can be easily embedded into deep learning models using prior neural network experience. As a result, spatial GNNs have developed rapidly and become popular thanks to their convincing performance.…”
Section: Introduction
confidence: 99%
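The neighborhood aggregation scheme mentioned above can be illustrated with a minimal sketch. This is not code from the paper: it is a hypothetical example of one round of mean aggregation (with self-loops), the basic building block that spatial GNNs iterate and interleave with learned transformations.

```python
import numpy as np

def aggregate_neighbors(adj, features):
    """One round of mean neighborhood aggregation (illustrative sketch).

    adj: (n, n) binary adjacency matrix of an undirected graph.
    features: (n, d) node feature matrix.
    Returns the (n, d) matrix of averaged neighborhood features.
    """
    # Add self-loops so each node also keeps its own features.
    adj_hat = adj + np.eye(adj.shape[0])
    # Degree of each node, including the self-loop.
    deg = adj_hat.sum(axis=1, keepdims=True)
    # Sum neighbor features, then normalize by degree (mean aggregation).
    return adj_hat @ features / deg

# Tiny path graph 0 - 1 - 2 with scalar node features.
adj = np.array([[0, 1, 0],
                [1, 0, 1],
                [0, 1, 0]], dtype=float)
x = np.array([[1.0], [2.0], [3.0]])
h = aggregate_neighbors(adj, x)
# Node 1 averages over {0, 1, 2}; the end nodes average over two nodes each.
```

A full spatial GNN layer would typically follow this aggregation with a learned linear map and a nonlinearity; stacking such layers lets information propagate over multi-hop neighborhoods.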