2021
DOI: 10.1088/1742-6596/1955/1/012078
Link Prediction Algorithm Based on Node Structure Similarity Measured by Relative Entropy

Abstract: Link prediction methods based on local information ignore the influence of neighbor structure on the similarity measurement between nodes. To address this, a link prediction method based on relative entropy and the local structure of nodes is proposed. First, a second-order local network is introduced to describe the local structure of each node; then, the structural similarity between nodes is described by redefining relative entropy; finally, the structural similarity of nodes is measured…
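The abstract describes the pipeline only at a high level. The sketch below is an illustrative assumption, not the paper's published algorithm: it summarizes each node's second-order neighborhood as a degree distribution, uses relative entropy (KL divergence) between two such distributions as a structural dissimilarity, and ranks non-adjacent pairs by the resulting similarity as link candidates. The function names, the smoothing constant eps, the symmetrization step, and the 1/(1 + divergence) similarity transform are all hypothetical choices made for this sketch.

# Minimal sketch of a relative-entropy, second-order-structure link predictor.
# Assumptions: degree distributions summarize local structure; eps smoothing;
# symmetrized KL divergence; similarity = 1 / (1 + divergence).
import math
import networkx as nx


def local_degree_distribution(G, node, radius=2, eps=1e-12):
    """Normalized degree distribution over the node's <= radius-hop neighborhood."""
    neighbourhood = nx.single_source_shortest_path_length(G, node, cutoff=radius)
    degrees = [G.degree(v) for v in neighbourhood]   # includes the node itself
    max_deg = max(dict(G.degree()).values())
    hist = [eps] * (max_deg + 1)                     # eps keeps log terms finite
    for d in degrees:
        hist[d] += 1
    total = sum(hist)
    return [h / total for h in hist]


def relative_entropy(p, q):
    """KL divergence D(p || q) between two distributions of equal length."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))


def structural_similarity(G, u, v):
    """Symmetrized relative entropy mapped to a similarity score in (0, 1]."""
    p, q = local_degree_distribution(G, u), local_degree_distribution(G, v)
    divergence = 0.5 * (relative_entropy(p, q) + relative_entropy(q, p))
    return 1.0 / (1.0 + divergence)                  # smaller divergence, higher similarity


if __name__ == "__main__":
    G = nx.karate_club_graph()
    # Rank non-adjacent pairs by structural similarity as candidate links.
    candidates = [(u, v) for u in G for v in G if u < v and not G.has_edge(u, v)]
    ranked = sorted(candidates, key=lambda e: structural_similarity(G, *e), reverse=True)
    print(ranked[:5])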

Cited by 3 publications (1 citation statement). References 18 publications.
“…The information entropy method can be used to quantify the link prediction problems based on a probability description. There are several entropy weight methods that have been proposed for link prediction research, such as the node similarity index of path entropy [38], structural entropy model [39], link prediction method based on relative entropy [40], and maximum entropy model [41]. The calculation process of information entropy is expressed as Equation (3).…”
Section: Information Entropy
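The citing passage points to an information-entropy calculation given as its Equation (3), which is not reproduced here. The standard Shannon form that such formulations build on, and the relative entropy (Kullback-Leibler divergence) used by the indexed paper, are commonly written in LaTeX as:

H(X) = -\sum_{i=1}^{n} p(x_i) \log p(x_i),
\qquad
D_{\mathrm{KL}}(P \,\|\, Q) = \sum_{i} p_i \log \frac{p_i}{q_i}.

Whether Equation (3) in the citing work matches this exact form is an assumption; the expressions above are the textbook definitions only.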