Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence 2018
DOI: 10.24963/ijcai.2018/360
A Degeneracy Framework for Graph Similarity

Abstract: The problem of accurately measuring the similarity between graphs is at the core of many applications in a variety of disciplines. Most existing methods for graph similarity focus either on local or on global properties of graphs. However, even if graphs seem very similar from a local or a global perspective, they may exhibit different structure at different scales. In this paper, we present a general framework for graph similarity which takes into account structure at multiple different scales. The proposed f…

Cited by 49 publications (29 citation statements)
References 16 publications
“…Experimental Setup: We compare the performance of the proposed BASGCN model on graph classification applications with a) six alternative state-of-the-art graph kernels and b) twelve alternative state-of-the-art deep learning methods for graphs. Specifically, the graph kernels include 1) the Jensen-Tsallis q-difference kernel (JTQK) with q = 2 [37], 2) the Weisfeiler-Lehman subtree kernel (WLSK) [11], 3) the shortest path graph kernel (SPGK) [38], 4) the shortest path kernel based on core variants (CORE SP) [39], 5) the random walk graph kernel (RWGK) [40], and 6) the graphlet count kernel (GK) [41]. On the other hand, the deep learning methods include 1) the deep graph convolutional neural network (DGCNN) [31], 2) the PATCHY-SAN based convolutional neural network for graphs (PSGCNN) [32], 3) the diffusion convolutional neural network (DCNN) [25], 4) the deep graphlet kernel (DGK) [42], 5) the graph capsule convolutional neural network (GCCNN) [43], 6) the anonymous walk embeddings based on feature driven (AWE) [44], 7) the edge-conditioned convolutional networks (ECC) [45], 8) the high-order graph convolution network (HO-GCN) [46], 9) the graph convolution network based on Differentiable Pooling (DiffPool) [47], 10) the graph convolution network based on Self-Attention Pooling (SAGPool) [48], 11) the graph convolutional network with EigenPooling (EigenPool) [48], and 12) the degree-specific graph neural networks (DEMO-Net) [49].…”
Section: A Comparisons On Graph Classification
confidence: 99%
“…Important approaches include random-walk and shortest paths based kernels [7,31,4,18], as well as the Weisfeiler-Lehman subtree kernel [30,23]. Further recent works focus on assignment-based approaches [16,27], spectral approaches [15], and graph decomposition approaches [26]. There has been some work considering dynamic graphs.…”
Section: Related Work
confidence: 99%
“…In a very recent work [113], the hierarchy of the core decomposition is utilized to provide a general framework for computing similarity metrics among graphs.…”
Section: Graph Similarity
confidence: 99%
“…trees, cycles) are compared between graphs in a local or global level with graph kernels. The aforementioned work [113], contributes by utilizing existing kernels at equivalent core levels between graphs in order to compare the structures at similar levels of connectivity.…”
Section: Graph Similarity
confidence: 99%
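The citing works above describe the paper's idea as follows: take the core decomposition of each graph and apply an existing graph kernel at equivalent core levels, so that structures are compared at similar levels of connectivity. A minimal sketch of that scheme is shown below, using `networkx` for the k-core computation. The function name `degeneracy_framework_similarity` and the summation over shared core levels are illustrative assumptions, not the authors' released code, and any base kernel could be plugged in.

```python
import networkx as nx


def degeneracy_framework_similarity(G1, G2, base_kernel):
    """Illustrative sketch of a degeneracy-based similarity framework.

    Compares two graphs at multiple scales: for every core level k up to
    the smaller of the two degeneracies, apply a base graph kernel to the
    k-cores of both graphs and accumulate the results. Assumed structure,
    not the paper's exact implementation.
    """
    # Degeneracy of a graph = maximum core number over its nodes.
    delta = min(max(nx.core_number(G1).values()),
                max(nx.core_number(G2).values()))
    total = 0.0
    for k in range(delta + 1):
        # k-core: maximal subgraph whose nodes all have degree >= k.
        total += base_kernel(nx.k_core(G1, k), nx.k_core(G2, k))
    return total
```

For example, with a toy base kernel that returns the smaller edge count of the two subgraphs, comparing two complete graphs K4 (degeneracy 3, so core levels 0 through 3, each core being K4 itself with 6 edges) accumulates 6 at each of the four levels. The appeal of the framework is that it upgrades any local or global kernel into a multi-scale one without modifying the kernel itself.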