2022
DOI: 10.1609/aaai.v36i7.20784
Interpretable Neural Subgraph Matching for Graph Retrieval

Abstract: Given a query graph and a database of corpus graphs, a graph retrieval system aims to deliver the most relevant corpus graphs. Graph retrieval based on subgraph matching has a wide variety of applications, e.g., molecular fingerprint detection, circuit design, software analysis, and question answering. In such applications, a corpus graph is relevant to a query graph if the query graph is (perfectly or approximately) a subgraph of the corpus graph. Existing neural graph retrieval models compare the node or gr…


Cited by 13 publications (6 citation statements)
References 24 publications
“…This is realized by jointly computing a similarity score between a pair of graphs. With the cross-graph matching approach, the reasoning of the relation between two graphs is made by modeling the node-to-node interactions [37,40,60,75,79] or by modeling the graph-to-graph interactions [5]. Since the graph-learning model attends a pair of graphs jointly, cross-graph matching methods are potentially stronger than the graph embedding models, and they can be made more resistant to slight variations between graphs.…”
Section: Graph Neural Network and Subgraph Matching (mentioning)
confidence: 99%
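The cross-graph matching idea quoted above — jointly scoring a pair of graphs through node-to-node interactions — can be made concrete with a small sketch. This is not the model from any of the cited works; the dot-product affinity, the softmax alignment, and the name `cross_graph_score` are illustrative assumptions, using only NumPy.

```python
import numpy as np

def cross_graph_score(H_q, H_c):
    """Jointly score a (query, corpus) graph pair from node embeddings.

    H_q: (n_q, d) node embeddings of the query graph
    H_c: (n_c, d) node embeddings of the corpus graph
    """
    # node-to-node affinity matrix (dot-product form, an illustrative choice)
    A = H_q @ H_c.T                          # shape (n_q, n_c)
    # softmax over corpus nodes: soft alignment of each query node
    W = np.exp(A - A.max(axis=1, keepdims=True))
    W /= W.sum(axis=1, keepdims=True)
    aligned = W @ H_c                        # corpus features aligned to query nodes
    # aggregate per-node agreement into one similarity score for the pair
    return float((H_q * aligned).sum(axis=1).mean())

rng = np.random.default_rng(0)
H_q, H_c = rng.normal(size=(4, 8)), rng.normal(size=(6, 8))
score = cross_graph_score(H_q, H_c)
```

Because both graphs are attended jointly, the score depends on the interaction of every query node with every corpus node, which is the property the quoted statement credits for the approach's robustness.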
“…Recently, graph neural networks (GNNs) have achieved significant success in graph representation learning [8] 2 . This led to the development of several learning-based methods for approximate subgraph matching with superior performance [5,37,40,46,60]. At the core of these methods is the learning of an embedding function that maps each graph into an embedding vector encapsulating its key features.…”
Section: Introduction (mentioning)
confidence: 99%
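The embedding-function approach described in this statement — map each graph to a vector, then compare vectors — can be sketched as a retrieval loop. Mean-pooling stands in for a trained GNN encoder, and `embed` / `retrieve` are hypothetical names, not APIs from the cited papers.

```python
import numpy as np

def embed(node_feats):
    """Stand-in embedding function: mean-pool node features.

    A real system would use a trained GNN here; mean-pooling is only a
    placeholder so the retrieval loop is runnable.
    """
    return node_feats.mean(axis=0)

def retrieve(query_feats, corpus_feats):
    """Rank corpus graphs by dot-product similarity of graph embeddings."""
    q = embed(query_feats)
    scores = [float(q @ embed(c)) for c in corpus_feats]
    return sorted(range(len(scores)), key=lambda i: -scores[i])

# toy corpus of three graphs, each given as an (n_nodes, d) feature matrix
corpus = [np.eye(3), np.ones((2, 3)), -np.ones((4, 3))]
ranking = retrieve(np.ones((5, 3)), corpus)   # indices, most similar first
```

The appeal noted in the quote is that each corpus embedding is computed once, independently of the query, so retrieval reduces to fast vector comparisons.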
“…With the rapid development of graph neural networks (GNNs), they have been widely applied to tasks related to subgraph isomorphism. Most works focus on graph similarity matching [20][21][22][23] or subgraph counting [24][25][26], which cannot provide exact matching results. However, another subset of studies [27,28] is dedicated to addressing the subgraph containment problem by leveraging GNNs.…”
Section: Introduction (mentioning)
confidence: 99%
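For contrast with the learned methods this statement discusses, exact subgraph containment can be decided by brute-force search over node mappings. The helper `subgraph_contains` is a hypothetical illustration; its exponential cost is precisely why approximate GNN-based approaches are attractive.

```python
from itertools import permutations

def subgraph_contains(corpus_edges, corpus_n, query_edges, query_n):
    """Exact (non-induced) subgraph containment by brute-force mapping.

    Returns True iff some injective map of query nodes into corpus nodes
    carries every query edge onto a corpus edge. Exponential in graph
    size; for illustration only.
    """
    corpus = {frozenset(e) for e in corpus_edges}
    for mapping in permutations(range(corpus_n), query_n):
        if all(frozenset((mapping[u], mapping[v])) in corpus
               for u, v in query_edges):
            return True
    return False

# triangle query inside a 4-cycle-with-chord corpus graph
corpus_edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
found = subgraph_contains(corpus_edges, 4, [(0, 1), (1, 2), (2, 0)], 3)
```

This is the "exact matching result" that, per the quote, similarity-matching and counting models cannot provide directly.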
“…In some extreme cases, the running time and the memory costs are not affordable for some real applications. The other approach is to predict the subgraph counts directly via deep learning [35,102,130,157,192]. These models can be trained on some small graphs and applied to large graphs without re-training, which is time and space-efficient when handling large graphs.…”
Section: Subgraph Counting On General Graphs (mentioning)
confidence: 99%
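The train-on-small, apply-to-large pattern described here can be sketched with a linear least-squares predictor standing in for the deep model. The degree-based features and the wedge-count (length-2 path) target are illustrative assumptions, not the setup of the cited works; the point is only that the fitted weights transfer to a larger graph without re-fitting.

```python
import numpy as np

def features(degrees):
    """Cheap per-graph features: node count, edge count, sum of C(deg, 2)."""
    d = np.asarray(degrees, dtype=float)
    return np.array([len(d), d.sum() / 2.0, (d * (d - 1) / 2.0).sum()])

# toy training set: degree sequences of small graphs; the exact wedge
# count of a graph is sum over nodes of C(deg, 2)
train = [[1, 1], [2, 2, 2], [1, 1, 2], [3, 1, 1, 1], [2, 2, 1, 1]]
X = np.stack([features(d) for d in train])
y = np.array([(np.array(d) * (np.array(d) - 1) / 2.0).sum() for d in train])

# "training": least-squares fit of a linear predictor (stand-in for a
# deep model trained on small graphs)
w, *_ = np.linalg.lstsq(X, y, rcond=None)

# apply to a larger graph without re-training
big_degrees = [4, 3, 3, 2, 2, 2]
pred = float(features(big_degrees) @ w)
```

Because prediction is a single feature pass, the time and space cost on the large graph stays small — the efficiency property the quoted statement highlights.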
“…We note that we do not compare our algorithm with those GNN-based approaches based on two considerations. First, there exist several works that target using GNN to capture the graphs' information and predict the subgraph counts, such as [35,102,130,157,192]. However, all these studies (1) target the static graphs, (2) need to store the whole graph which cannot satisfy the constraint where we just use a limited space and…”
Section: Policy Learning (mentioning)
confidence: 99%