2017
DOI: 10.1016/j.acha.2016.11.002
A note on Markov normalized magnetic eigenmaps

Abstract: We note that building a magnetic Laplacian from the Markov transition matrix, rather than the graph adjacency matrix, yields several benefits for the magnetic eigenmaps algorithm. The two largest benefits are that the embedding becomes more stable as a function of the rotation parameter g, and that the principal eigenvector of the magnetic Laplacian now converges to the PageRank of the network as a function of diffusion time. We show empirically that this normalization improves the phase and real/imaginary embedding…
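The construction summarized in the abstract can be sketched as follows. This is a minimal illustration assuming the standard magnetic Laplacian construction with rotation parameter g; the function name, the toy graph, and the value of g are illustrative assumptions, not the authors' code:

```python
import numpy as np

def magnetic_laplacian(M, g=0.1):
    """Magnetic Laplacian of a (possibly non-symmetric) weight matrix M.

    The symmetrized weights carry edge magnitudes; the antisymmetric
    part is encoded as a complex phase, giving a Hermitian matrix.
    """
    W = (M + M.T) / 2.0                    # undirected magnitudes
    theta = 2.0 * np.pi * g * (M - M.T)    # phases from edge direction
    H = W * np.exp(1j * theta)             # Hermitian magnetic adjacency
    D = np.diag(W.sum(axis=1))
    return D - H

# Toy weighted directed 3-cycle (illustrative)
A = np.array([[0., 2., 0.],
              [0., 0., 1.],
              [1., 0., 0.]])

# Adjacency-based vs. Markov-normalized construction:
# P is the row-stochastic transition matrix obtained by row-normalizing A.
P = A / A.sum(axis=1, keepdims=True)
L_adj = magnetic_laplacian(A)
L_markov = magnetic_laplacian(P)

# Both variants are Hermitian and positive semidefinite.
for L in (L_adj, L_markov):
    assert np.allclose(L, L.conj().T)
    assert np.linalg.eigvalsh(L).min() >= -1e-10
```

The only difference between the two variants is the input matrix: the Markov-normalized version feeds the transition matrix P into the same construction, which is what makes the embedding less sensitive to the choice of g.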

Cited by 8 publications (4 citation statements) · References 3 publications
“…[5,32,35,40] But thanks to its Hermitian properties, the magnetic Laplacian has been utilized in studies of directed graphs. [14,34] Applications such as graph clustering [4,6], node classification [54], and graph representation learning [12] have been presented. MagNet [54] proved the positive semidefinite property of the magnetic Laplacian and proposed a spectral graph convolution.…”
Section: Magnetic Laplacian (mentioning, confidence: 99%)
“…This matrix represents the undirected geometry of the graph in the magnitude of its entries and incorporates directional information in the phases. It has been studied by the graph signal processing community [30] and also applied to numerous data science applications such as clustering and community detection [27,16,25,26]. Recently, [76] showed that the Magnetic Laplacian could be effectively incorporated into a graph neural network.…”
Section: Signed Graphs, Directed Graphs, and Hypergraphs (mentioning, confidence: 99%)
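The magnitude/phase split described in the statement above can be seen on a single directed edge. This is an illustrative sketch; the two-node graph and the value g = 0.25 are assumptions chosen to make the phases land at ±π/2:

```python
import numpy as np

# One directed edge u -> v in a two-node graph (illustrative)
A = np.array([[0., 1.],
              [0., 0.]])
g = 0.25                                 # rotation parameter (assumed value)

W = (A + A.T) / 2.0                      # magnitudes: symmetric, undirected geometry
theta = 2.0 * np.pi * g * (A - A.T)      # phases: antisymmetric, edge direction
H = W * np.exp(1j * theta)               # Hermitian magnetic adjacency

# |H| is symmetric, so the magnitudes see only the undirected geometry;
# the phases +pi/2 and -pi/2 record which way the edge points.
assert np.allclose(np.abs(H), np.abs(H).T)
assert np.isclose(np.angle(H[0, 1]), np.pi / 2)
assert np.isclose(np.angle(H[1, 0]), -np.pi / 2)
```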
“…However, a user may choose to also tune q through a standard cross-validation procedure as in [13]. Moreover, one can readily address the latter issue by normalizing the adjacency matrix via a preprocessing step (e.g., [30]). In contrast to our magnetic signed Laplacian, in the case where the graph is not signed but is weighted and directed, the matrix proposed in [29] does not reduce to the magnetic Laplacian considered in [13].…”
Section: Related Work (mentioning, confidence: 99%)