2019
DOI: 10.1007/s42452-019-1044-9
Network representation learning: models, methods and applications

Abstract: With the rise of large-scale social networks, network mining has become an important sub-domain of data mining. Generating an efficient network representation is one important challenge in applying machine learning to network data. Recently, representation learning methods have been widely used in various domains to generate low-dimensional latent features from complex high-dimensional data. A significant amount of research effort has been made in the past few years to generate node representations from graph-structured d…

Cited by 11 publications (8 citation statements)
References: 115 publications
“…As a machine learning paradigm, incremental learning takes place whenever a change in an information network is detected, meaning that what the current model has learned must be adjusted to reflect the newly arrived or removed data [68]. In incremental learning, the training data are never fully available, since they arrive over time, so the learning model is only partially updated. Recent incremental learning methods include matrix factorization models (e.g., ISVD [69], IRMF [70]), metapath-based models (e.g., metapath2vec [71], LIME [72]), and GNN models.…”
Section: Design of Incremental Multi-View Embedding
Mentioning confidence: 99%
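The partial-update idea behind incremental embedding can be illustrated with a short sketch. The snippet below is not taken from any of the cited methods (ISVD, IRMF, metapath2vec, LIME); it only shows, under illustrative assumptions about the objective, hyperparameters, and node indices, how the embeddings of the nodes touched by a newly observed edge can be adjusted with a few gradient steps while the rest of the model is left untouched.

```python
# Minimal sketch of incremental (partial) embedding updates on a new edge.
# All names and hyperparameters are illustrative assumptions, not the surveyed methods.
import numpy as np

rng = np.random.default_rng(0)
n_nodes, dim = 100, 16
emb = rng.normal(scale=0.1, size=(n_nodes, dim))   # current embedding table

def incremental_edge_update(emb, u, v, lr=0.05, steps=5, n_neg=3):
    """Adjust only the embeddings affected by a newly observed edge (u, v)."""
    for _ in range(steps):
        # positive pair: pull u and v together (logistic loss on their dot product)
        score = 1.0 / (1.0 + np.exp(-emb[u] @ emb[v]))
        grad = score - 1.0
        gu, gv = grad * emb[v], grad * emb[u]
        # negative samples: push u away from a few random nodes
        # (ignoring the rare case where a sample equals u or v, for brevity)
        for w in rng.integers(0, emb.shape[0], size=n_neg):
            s = 1.0 / (1.0 + np.exp(-emb[u] @ emb[w]))
            gu += s * emb[w]
            emb[w] -= lr * s * emb[u]
        emb[u] -= lr * gu
        emb[v] -= lr * gv
    return emb

# a new edge (3, 42) is detected in the evolving network -> partial update only
emb = incremental_edge_update(emb, 3, 42)
```

In this view, retraining cost scales with the size of the change rather than the size of the network, which is the point the quoted statement makes about partial updates.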
“…Aiming to address the above challenges, incremental learning [68] is applied as a solution that helps autonomic managers discard obsolete knowledge (e.g., useless management policies) and equip the BSKG with updated knowledge (e.g., newly added management policies, the latest QoS levels, new substituting services, etc.). Next, we start by modeling the captured changes on the managed BSKG.…”
Section: Big Service Knowledge Graph
Mentioning confidence: 99%
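As a hedged illustration of what "modeling the captured changes" on an evolving knowledge graph might look like, the sketch below represents additions and removals as a stream of change events and reports which nodes would need their knowledge or embeddings refreshed. The event schema and the use of `networkx` are assumptions made for illustration, not the authors' BSKG implementation.

```python
# Illustrative sketch: capture graph changes as events and apply them incrementally.
from dataclasses import dataclass
import networkx as nx

@dataclass
class GraphChange:
    op: str          # "add_edge", "remove_edge", "add_node", "remove_node"
    payload: tuple   # (u, v) for edges, (u,) for nodes

def apply_changes(g: nx.Graph, changes):
    """Apply change events and return the nodes whose neighborhoods changed."""
    touched = set()
    for c in changes:
        if c.op == "add_edge":
            g.add_edge(*c.payload)
            touched.update(c.payload)
        elif c.op == "remove_edge":
            g.remove_edge(*c.payload)
            touched.update(c.payload)
        elif c.op == "add_node":
            g.add_node(c.payload[0])
            touched.add(c.payload[0])
        elif c.op == "remove_node":
            touched.update(g.neighbors(c.payload[0]))
            g.remove_node(c.payload[0])
    return touched  # only these nodes need their knowledge/embeddings refreshed

g = nx.karate_club_graph()
stale = apply_changes(g, [GraphChange("add_edge", (0, 33)),
                          GraphChange("remove_edge", (0, 2))])
print("nodes to refresh:", sorted(stale))
```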
“…There have been various studies on learning representations of networks and of the entities within them (e.g., nodes, edges, substructures) [28][29][30]. There have also been various network embedding methods [31][32][33][34] that have exhibited state-of-the-art performance.…”
Section: The Character Network Is a Representation of Relationships B…
Mentioning confidence: 99%
“…Network representation learning aims to learn representations of network nodes such that these learned representations can be expressed as latent, informative, low-dimensional vectors while preserving the network topology, node features, labels, and other auxiliary information [24,31]. These generated vectors can effectively support a wide range of downstream tasks, such as node classification [15,21,42], link prediction [4,48], and recommendation [43].…”
Section: Introduction
Mentioning confidence: 99%
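A minimal sketch of the pipeline this last statement describes: learn low-dimensional node vectors that preserve network structure, then reuse them for a downstream task such as node classification. Here, truncated SVD of the adjacency matrix stands in for the embedding step that methods such as DeepWalk, node2vec, or GNNs would perform; the toy dataset and classifier choices are illustrative assumptions, not part of the surveyed paper.

```python
# Sketch: network embedding followed by a downstream node-classification task.
import networkx as nx
import numpy as np
from sklearn.decomposition import TruncatedSVD
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# toy graph with a known two-community structure
g = nx.karate_club_graph()
labels = np.array([0 if g.nodes[i]["club"] == "Mr. Hi" else 1 for i in g.nodes])

# embedding step: project the adjacency matrix into a low-dimensional latent space
adj = nx.to_numpy_array(g)
emb = TruncatedSVD(n_components=8, random_state=0).fit_transform(adj)

# downstream task: node classification on the learned vectors
X_tr, X_te, y_tr, y_te = train_test_split(emb, labels, test_size=0.3, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("node classification accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```

Swapping the SVD step for a random-walk or GNN-based embedder changes only the middle block; the downstream classifier consumes the learned vectors in the same way, which is why such embeddings are reusable across tasks like link prediction and recommendation as well.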