2018
DOI: 10.1016/j.physa.2018.01.002
Mixture models with entropy regularization for community detection in networks

Cited by 8 publications (5 citation statements)
References 36 publications
“…In the external evaluation, the comparison algorithms are AGDL 29, Fluid 30, Belief 31, CPM 32, Chinese Whispers 33, DER 34, Eigenvector 35, EM 36, Genetic Algorithm 14, Girvan Newman 12, Greedy Modularity 13, Kcut 37, Label Propagation 38, Leiden 11, Louvain 10, Markov Clustering 39, RBER Pots, RB Pots 40, Significance 41, Spinglass 40, Surprise 42, Walktrap 43, Head tail 44, LSWL+ 45, Paris, and Regularized spectral 46, using the libraries CDLIB 47,48 and Networkx 49.…”
Section: Discussion (mentioning)
confidence: 99%
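The quoted evaluation pipeline runs a large set of community-detection baselines through CDLIB and NetworkX. As a minimal sketch of that kind of external comparison (the karate-club benchmark graph and the choice of two baselines are illustrative assumptions, not the citing paper's exact setup), two of the listed algorithms, Louvain and Greedy Modularity, can be run and scored by modularity using NetworkX alone:

```python
import networkx as nx

# Illustrative benchmark graph; the citing paper's datasets differ.
G = nx.karate_club_graph()

# Two of the baselines named in the quote, via NetworkX's community module.
louvain = nx.community.louvain_communities(G, seed=42)
greedy = nx.community.greedy_modularity_communities(G)

# Score each partition by Newman-Girvan modularity (higher is better).
for name, partition in [("Louvain", louvain), ("Greedy Modularity", greedy)]:
    q = nx.community.modularity(G, partition)
    print(f"{name}: {len(partition)} communities, modularity = {q:.3f}")
```

CDLIB wraps many more of the listed methods behind a uniform interface; NetworkX suffices here because both baselines and the modularity score ship with it.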
“…Values close to 1 indicate a strong correlation between two variables, whereas values close to 0 indicate a weak correlation. For the external evaluation, the comparison algorithms were AGDL 31, Fluid 32, Belief 33, CPM 34, Chinese Whispers 35, DER 36, Eigenvector 37, EM 38, Genetic Algorithm 16, Girvan Newman 14, Greedy Modularity 15, Kcut 39, Label Propagation 40, Leiden 13, Louvain 12, Markov Clustering 41, RBER Pots, RB Pots 42, Significance 43, Spinglass 42, Surprise 44, Walktrap 45, Head tail 46, LSWL+ 47, Paris, and Regularized spectral 48, using the libraries CDLIB 49,50 and Networkx 51.…”
Section: Methods (mentioning)
confidence: 99%
“…It also reflects the complexity of the sequence, and its value is proportional to the complexity. Compared with approximate entropy, sample entropy has the following advantages: (1) it does not depend on the length of the data; (2) it has better consistency [27]. The sample entropy can be represented as: …”
Section: Methods (mentioning)
confidence: 99%