2018
DOI: 10.1016/j.knosys.2018.02.028

Community aware random walk for network embedding

Abstract: Social network analysis provides meaningful information about the behavior of network members that can be used for diverse applications such as classification and link prediction. However, network analysis is computationally expensive because of feature learning for different applications. In recent years, many studies have focused on feature learning methods in social networks. Network embedding represents the network in a lower-dimensional representation space with the same properties, which presents a compressed…

Cited by 77 publications (36 citation statements) | References 20 publications
“…MNMF [10] successfully preserves the community structure in the generated node representations and improves the quality of the node representation vectors: it preserves the first- and second-order similarities of nodes with Non-negative Matrix Factorization (NMF) and models the community structure via modularity maximization. CARE [11] uses community attributes to capture more of the network's structural information. It designs a community-aware random walk strategy and uses the Skip-gram model to learn the node representations.…”
Section: Related Work (mentioning)
confidence: 99%
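The statement above summarizes CARE's pipeline: community-aware random walks feeding a Skip-gram model. Below is a minimal, hedged sketch of that idea; the community detector, the in-community jump probability `alpha`, and all hyperparameters are illustrative assumptions rather than the paper's exact design.

```python
# Sketch of a community-aware random walk followed by Skip-gram, in the
# spirit of CARE [11]. `alpha`, the community detector, and the Word2Vec
# settings are assumptions for illustration, not the paper's choices.
import random
import networkx as nx
from gensim.models import Word2Vec

def community_aware_walks(G, num_walks=10, walk_len=40, alpha=0.15, seed=0):
    rng = random.Random(seed)
    # Any community detector works for this sketch.
    communities = nx.algorithms.community.greedy_modularity_communities(G)
    node2comm = {v: i for i, c in enumerate(communities) for v in c}
    comm_members = {i: list(c) for i, c in enumerate(communities)}
    walks = []
    for _ in range(num_walks):
        for start in G.nodes():
            walk = [start]
            while len(walk) < walk_len:
                cur = walk[-1]
                if rng.random() < alpha:
                    # Community-aware step: jump to a random node
                    # in the current node's community.
                    nxt = rng.choice(comm_members[node2comm[cur]])
                else:
                    nbrs = list(G.neighbors(cur))
                    if not nbrs:
                        break
                    nxt = rng.choice(nbrs)
                walk.append(nxt)
            walks.append([str(v) for v in walk])
    return walks

G = nx.karate_club_graph()
walks = community_aware_walks(G)
# Skip-gram (sg=1) over the walks yields the node embeddings.
model = Word2Vec(walks, vector_size=64, window=5, min_count=0, sg=1, epochs=5)
print(model.wv[str(0)].shape)  # (64,)
```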
“…For example, MNMF [10] preserves the community structure of the network while considering the proximity of nodes. CARE [11] designs community-aware random walks when learning the node representation vectors. Based on the Expansion Sampling (XS) strategy [12][13][14], SENE [15] preserves the mesoscopic community information in the learned node representation vectors.…”
Section: Introduction (mentioning)
confidence: 99%
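For intuition about the community structure these methods preserve, the short sketch below computes the modularity score that MNMF-style objectives maximize; the example graph and the community detector are illustrative assumptions, not MNMF's actual optimization.

```python
# Sketch: modularity as a measure of community structure.
# Higher Q means denser intra-community connections than a random baseline.
import networkx as nx
from networkx.algorithms import community

G = nx.karate_club_graph()
parts = community.greedy_modularity_communities(G)  # any partition works here
Q = community.modularity(G, parts)
print(f"{len(parts)} communities, modularity Q = {Q:.3f}")
```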
“…The Hadamard operator is the best operator for edge feature learning from user feature vectors [24], [26]. An edge feature vector indicates that the source and destination nodes of a link are similar based on their structural and content features.…”
Section: Figure 3 - Content Feature Vector Learning in DeepLink (mentioning)
confidence: 99%
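As a concrete illustration of the Hadamard operator mentioned above, the sketch below builds an edge feature vector as the element-wise product of the two endpoint embeddings; the random embeddings and their dimensionality are placeholders, not DeepLink's actual vectors.

```python
# Sketch: Hadamard (element-wise) product of two node embeddings
# as the feature vector of the edge (u, v), e.g. for link prediction.
import numpy as np

rng = np.random.default_rng(0)
emb = {node: rng.normal(size=64) for node in range(10)}  # placeholder embeddings

def edge_feature(u, v):
    # Element-wise product of the endpoint vectors.
    return emb[u] * emb[v]

x_uv = edge_feature(2, 7)
print(x_uv.shape)  # (64,) -> fed to a binary classifier
```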
“…In machine learning applications, the traditional single-label classification (SLC) problem has been explored substantially. However, more recently, the multi-label classification (MLC) problem has attracted increasing research interest because of its wide range of applications, such as text classification [1,2], social network analysis [3], gene function classification [4], and image/video annotation [5]. With SLC, one instance only belongs to one category, whereas with MLC, it can be allocated to multiple categories simultaneously.…”
Section: Introduction (mentioning)
confidence: 99%
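To make the SLC/MLC distinction concrete, here is a small, hedged sketch; the synthetic data and the one-vs-rest baseline are illustrative choices only, not the cited works' methods.

```python
# Sketch: single-label vs. multi-label classification on synthetic data.
from sklearn.datasets import make_multilabel_classification
from sklearn.multioutput import MultiOutputClassifier
from sklearn.linear_model import LogisticRegression

X, Y = make_multilabel_classification(n_samples=200, n_features=20,
                                      n_classes=5, random_state=0)

# SLC: each instance gets exactly one label (here, argmax of the label matrix).
y_single = Y.argmax(axis=1)
slc = LogisticRegression(max_iter=1000).fit(X, y_single)

# MLC: each instance may carry several labels at once;
# one-vs-rest via MultiOutputClassifier is a simple baseline.
mlc = MultiOutputClassifier(LogisticRegression(max_iter=1000)).fit(X, Y)
print(slc.predict(X[:1]), mlc.predict(X[:1]))
```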