2010 IEEE International Conference on Data Mining Workshops
DOI: 10.1109/icdmw.2010.63

Cluster Cores and Modularity Maximization

Abstract: The modularity function is a widely used measure for the quality of a graph clustering. Finding a clustering with maximal modularity is NP-hard. Thus, only heuristic algorithms are capable of processing large datasets. Extensive literature on such heuristics has been published in recent years. We present a fast randomized greedy algorithm which uses solely local information on gradients of the objective function. Furthermore, we present an approach which first identifies the 'cores' of clusters before calc…
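For reference, the measure in question is presumably the standard Newman–Girvan modularity (the paper's exact formulation is not shown in the truncated abstract): for an undirected graph with adjacency matrix $A$, vertex degrees $k_i$, $m$ edges, and cluster assignments $c_i$,

\[
Q = \frac{1}{2m}\sum_{i,j}\left(A_{ij} - \frac{k_i k_j}{2m}\right)\delta(c_i, c_j).
\]

A greedy heuristic of the kind the abstract describes typically evaluates the gain of merging two clusters $a$ and $b$, which under this definition reduces to $\Delta Q_{ab} = \ell_{ab}/m - D_a D_b/(2m^2)$, where $\ell_{ab}$ is the number of edges between the two clusters and $D_a$, $D_b$ are their total degrees.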

Cited by 16 publications (13 citation statements)
References 30 publications
“…The objective of this contribution is to present a new graph clustering scheme, called the Core Groups Graph Cluster (CGGC) scheme, which is able to find high quality clustering by using an ensemble learning approach. In [19] we presented an algorithm called RG+ for maximizing the modularity of a graph partition via an intermediate step of first identifying core groups of vertices. The RG+ algorithm was able to outperform all previously published heuristics in terms of optimization quality.…”
Section: Introduction
confidence: 99%
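As an illustration of the randomized greedy idea behind RG+, here is a minimal sketch, assuming the standard modularity gain given above; the sampling size k, the data layout, and the helper names are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch of a randomized greedy modularity step (illustrative, not RG+).
import random
from collections import defaultdict

def merge_gain(l_ab, deg_a, deg_b, m):
    """Modularity gain of merging clusters a and b: l_ab/m - D_a*D_b/(2*m^2)."""
    return l_ab / m - deg_a * deg_b / (2.0 * m * m)

def randomized_greedy_step(edges, comm, k=4, rng=random.Random(0)):
    """Sample k clusters and merge the sampled cluster with the neighbouring
    cluster giving the largest positive modularity gain, if any."""
    m = len(edges)
    deg = defaultdict(int)      # D_c: total degree of cluster c
    links = defaultdict(int)    # l_ab: number of edges between clusters a and b
    for u, v in edges:
        cu, cv = comm[u], comm[v]
        deg[cu] += 1
        deg[cv] += 1
        if cu != cv:
            links[frozenset((cu, cv))] += 1
    best_gain, best_pair = 0.0, None
    for a in rng.sample(sorted(deg), min(k, len(deg))):
        for pair, l_ab in links.items():
            if a not in pair:
                continue
            (b,) = pair - {a}
            gain = merge_gain(l_ab, deg[a], deg[b], m)
            if gain > best_gain:
                best_gain, best_pair = gain, (a, b)
    if best_pair:
        a, b = best_pair
        for v in comm:                      # relabel cluster b into cluster a
            if comm[v] == b:
                comm[v] = a
    return best_gain

# Toy usage: start from singleton clusters and merge until no gain remains.
edges = [(0, 1), (1, 2), (0, 2), (2, 3), (3, 4), (4, 5), (3, 5)]
comm = {v: v for e in edges for v in e}
while randomized_greedy_step(edges, comm) > 0:
    pass
print(comm)   # the two triangles {0,1,2} and {3,4,5} end up as two clusters
```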
“…Schapire showed that strong classifiers can be derived from weak classifiers that are only slightly better than random choice [18]. The general outline of an ensemble learning clustering algorithm for community detection is roughly as follows [14]: (1) Create a set of partitions with some (weak) learning algorithm, (2) Identify the maximal overlap of the partitions as depicted in Figure 1, (3) Continue the search from the maximal overlap with the algorithm used in (1) or any other appropriate algorithm.…”
Section: Community Detection and Ensemble Learning
confidence: 99%
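Step (2) of the outline quoted above, identifying the maximal overlap (the core groups), can be sketched as follows; the function name and the dict-based partition representation are illustrative assumptions, not taken from the cited papers.

```python
# Minimal sketch of the "maximal overlap" of several partitions: vertices that
# share a cluster in every partition end up in the same core group.
from collections import defaultdict

def core_groups(partitions):
    """partitions: list of dicts mapping vertex -> cluster label."""
    groups = defaultdict(list)
    for v in partitions[0]:
        signature = tuple(p[v] for p in partitions)   # v's label in each partition
        groups[signature].append(v)
    return list(groups.values())

# Example: two weak partitions of five vertices
p1 = {"a": 0, "b": 0, "c": 1, "d": 1, "e": 1}
p2 = {"a": 0, "b": 0, "c": 0, "d": 1, "e": 1}
print(core_groups([p1, p2]))   # [['a', 'b'], ['c'], ['d', 'e']]
```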
“…An algorithm based on the concept of ensemble learning has been proposed in [14]. Ensemble learning is a general learning concept whose key component is to create good classifiers by learning several (weak) classifiers and combining them.…”
Section: Community Detection and Ensemble Learning
confidence: 99%
“…In 2009, Clara Pizzuti proposed a multi-objective genetic algorithm to uncover community structure in complex networks, which generates a set of network divisions at different hierarchical levels [8]. In 2010, Michael Ovelgönne and Andreas Geyer-Schulz presented a fast randomized greedy algorithm which uses solely local information on gradients of the objective function [9]. Most of the current algorithms have a high computational complexity that makes them unsuitable for large-scale networks.…”
Section: Introduction
confidence: 99%