2018 IEEE 34th International Conference on Data Engineering (ICDE)
DOI: 10.1109/icde.2018.00241

Incremental Frequent Subgraph Mining on Large Evolving Graphs

Abstract: Frequent subgraph mining is a core graph operation used in many domains. Most existing techniques target static graphs. However, modern applications utilize large evolving graphs. Mining these graphs using existing techniques is infeasible because of the high computational cost. We propose IncGM+, a fast incremental approach for frequent subgraph mining on large evolving graphs. We adapt the notion of "fringe" to the graph context, that is, the set of subgraphs on the border between frequent and infrequent sub…
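As a rough illustration of the "fringe" idea in the abstract, the sketch below keeps the patterns on the border between frequent and infrequent in two sets and re-classifies only those patterns when an update changes their support. This is a minimal sketch under our own assumptions (the class and field names are invented here for illustration), not the IncGM+ implementation.

# Minimal sketch of a "fringe" between frequent and infrequent patterns.
# The MFS/MIFS naming and structure are illustrative assumptions, not IncGM+ code.
# Patterns are assumed to be hashable canonical labels (e.g., DFS codes).
class Fringe:
    def __init__(self, min_support: int):
        self.min_support = min_support
        self.mfs = {}   # pattern -> support, patterns on the frequent side of the border
        self.mifs = {}  # pattern -> support, patterns on the infrequent side

    def on_support_change(self, pattern, new_support: int) -> bool:
        # Re-classify a single fringe pattern after a graph update changed its
        # support; patterns far from the threshold never need to be touched.
        frequent = new_support >= self.min_support
        if frequent:
            self.mifs.pop(pattern, None)
            self.mfs[pattern] = new_support
        else:
            self.mfs.pop(pattern, None)
            self.mifs[pattern] = new_support
        return frequent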

Cited by 15 publications (24 citation statements). References 6 publications.
“…We define the support set σ(G_S) of any k-vertex subgraph G_S ∈ C_k as the number of k-vertex subgraphs of G that are isomorphic to G_S, i.e., σ(G_S) = |C_k^i|, where G_S ∈ C_k^i. (Footnote 1: Notice that the value of T_k is simply determined by k, |L|, and |Q|.) We then define the…”
Section: Problem Definition (mentioning)
confidence: 99%
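The quoted definition can be checked directly, if inefficiently, by brute force. Below is a small Python sketch using networkx (the library choice is ours, and treating the counted subgraphs as induced subgraphs is an assumption, not something stated in the excerpt):

from itertools import combinations
import networkx as nx

def support(G: nx.Graph, G_S: nx.Graph) -> int:
    # Brute-force reading of the quoted definition: count the k-vertex
    # (induced) subgraphs of G that are isomorphic to the pattern G_S.
    k = G_S.number_of_nodes()
    return sum(
        1
        for nodes in combinations(G.nodes, k)
        if nx.is_isomorphic(G.subgraph(nodes), G_S)
    )

# Example: count the triangles among all 3-vertex subgraphs of a small graph.
print(support(nx.karate_club_graph(), nx.complete_graph(3)))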
“…The closest to our setting is the work by Ray et al. [29], which considers a single graph with continuous updates; however, their approach is a simple heuristic applicable only to incremental streams and without provable guarantees. Likewise, Abdelhamid et al. [1] consider an analogous setting and propose an exact algorithm which borrows from the literature on incremental pattern mining. The algorithm keeps track of "fringe" subgraph patterns, which are around the frequency threshold, and all their possible expansions/contractions (by adding/removing one edge).…”
Section: Related Work (mentioning)
confidence: 99%
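As a rough illustration of the expansion/contraction step described in that excerpt (our own sketch, not code from [1]; for simplicity, expansions here only add edges between existing vertices rather than introducing new ones), the one-edge neighbours of a pattern can be enumerated as follows:

from itertools import combinations
import networkx as nx

def one_edge_expansions(pattern: nx.Graph):
    # Patterns obtained from `pattern` by adding one edge between two
    # existing, currently non-adjacent vertices.
    for u, v in combinations(pattern.nodes, 2):
        if not pattern.has_edge(u, v):
            bigger = pattern.copy()
            bigger.add_edge(u, v)
            yield bigger

def one_edge_contractions(pattern: nx.Graph):
    # Patterns obtained from `pattern` by removing one of its edges.
    for u, v in list(pattern.edges):
        smaller = pattern.copy()
        smaller.remove_edge(u, v)
        yield smaller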
“…Many matching problems for large-scale and time-evolving graphs have been proposed. Previous work proposed in [1], [4], [5], [6] updates the results partially as the structure and attributes of the input data graph are updated. However, the efficiency of the incremental process depends on the types of the input data graphs and query pattern graphs.…”
Section: Research Challenges and Contributions (mentioning)
confidence: 99%
“…That means each vertex in the subgraph corresponds to a vertex in the query graph, and the connectivity is maintained even when the exact pattern does not exist in the input data graph. Additionally, the computation cost of G-Ray is linear with respect to the size…

[Comparison table in the citing paper (columns: Paper, Method, ISO, Attr, Time, Approx):
IncGM+ [4]: Compute only fringe subgraphs
D-ISI [5]: Distributed graph pruning
Bounded Simulation [1]: Query preserving graph compression
Turbo ISO [6]: Region exploration and COMB/PERM
Our Method: Adaptive and approximate IGPM and PEM]

We adopt G-Ray as a baseline of our method since it has the following properties, which satisfy part of the requirements we defined.…”
Section: A. Approximate Graph Pattern Matching (mentioning)
confidence: 99%
“…This algorithm incurs memory overhead to maintain maximal frequent subgraphs (MFS) and minimal infrequent subgraphs (MIFS). [21] MuGraM by Vijay et al. takes FSM to a level where multigraphs can be processed for subgraph mining, and it was tested on many datasets such as Amazon, CiteSeer, etc. This backtracking-based algorithm keeps support computation to a minimum, and its subgraph pruning reduces the search-space complexity, making it better than a few other approaches.…”
Section: Existing FSM Algorithms (mentioning)
confidence: 99%