2005
DOI: 10.1007/11492542_16
Graph Clustering Using Heat Content Invariants

Cited by 3 publications (3 citation statements)
References 9 publications
“…Heat Content Invariant. From the graph Laplacian and degree matrices $L$ and $D$ (Equations (A24) and (A23), respectively), the normalised Laplacian matrix can be evaluated as $\hat{L} = D^{-1/2} L D^{-1/2}$. The spectral decomposition of $\hat{L}$ reads as $\hat{L} = \Phi \Lambda \Phi^{T}$, where $\Lambda$ is a diagonal matrix containing the eigenvalues $\lambda_1 \le \dots \le \lambda_{|V|}$ in increasing order and $\Phi$ contains the corresponding unit-length eigenvectors. The heat equation associated to $\hat{L}$ is given by [124, 125]: $\frac{\partial h_t}{\partial t} = -\hat{L} h_t$, where $h_t = e^{-t \hat{L}}$ is the heat kernel matrix at time $t$. The heat content of the graph is given by $Q(t) = \sum_{u \in V} \sum_{v \in V} h_t(u, v) = \sum_{k=1}^{|V|} \big( \sum_{u \in V} \phi_k(u) \big)^{2} e^{-\lambda_k t}$, where $\phi_k(u)$ is the value related to node $u$ in the $k$th eigenvector.…”
Section: Appendix A1 Betti Numbers
Confidence: 99%
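The heat content described in the quoted passage can be computed directly from the eigendecomposition of the normalised Laplacian. The following is a minimal sketch, assuming an undirected graph supplied as a NumPy adjacency matrix; the function name `heat_content` is illustrative and not from the cited paper's code.

```python
import numpy as np

def heat_content(A, t):
    """Heat content Q(t) of an undirected graph given its adjacency matrix A."""
    d = A.sum(axis=1)                          # node degrees
    D_is = np.diag(1.0 / np.sqrt(d))           # D^{-1/2}
    L_hat = np.eye(len(A)) - D_is @ A @ D_is   # normalised Laplacian D^{-1/2} L D^{-1/2}
    lam, Phi = np.linalg.eigh(L_hat)           # eigenvalues ascending, unit eigenvectors
    col_sums = Phi.sum(axis=0)                 # sum_u phi_k(u) for each eigenvector k
    # Q(t) = sum_k ( sum_u phi_k(u) )^2 * exp(-lambda_k * t)
    return float(np.sum(col_sums ** 2 * np.exp(-lam * t)))

# Example: path graph on 3 nodes
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
Q0 = heat_content(A, 0.0)   # h_0 is the identity, so Q(0) = |V| = 3
```

At $t = 0$ the heat kernel is the identity, so $Q(0) = |V|$; as $t$ grows, $Q(t)$ decays towards the contribution of the zero eigenvalue, which is what makes its expansion coefficients usable as graph invariants.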
“…The heat equation associated to $\hat{L}$ is given by [124, 125]: $\frac{\partial h_t}{\partial t} = -\hat{L} h_t$, where $h_t = e^{-t \hat{L}}$ is the heat kernel matrix at time $t$. The heat content of the graph is given by $Q(t) = \sum_{u \in V} \sum_{v \in V} h_t(u, v) = \sum_{k=1}^{|V|} \big( \sum_{u \in V} \phi_k(u) \big)^{2} e^{-\lambda_k t}$, where $\phi_k(u)$ is the value related to node $u$ in the $k$th eigenvector.…”
Section: Appendix A1 Betti Numbers
Confidence: 99%
“…A first strategy consists of engineering numerical features drawn from the structured data at hand, which are then concatenated into a vector. Examples of feature-engineering techniques involve entropy measures (Han et al., 2011; Ye et al., 2014; Bai et al., 2012), centrality measures (Mizui et al., 2017; Martino et al., 2018b; Leone Sciabolazza and Riccetti, 2020; Martino et al., 2020a), the heat trace (Xiao and Hancock, 2005; Xiao et al., 2009) and modularity (Li, 2013). Whilst this approach is straightforward and moves the pattern recognition problem to a Euclidean space, in which any pattern recognition algorithm can be used without alteration, designing the mapping function (i.e., enumerating the set of numerical features to be extracted) requires deep knowledge of both the problem and the data at hand: indeed, for the same input space, different subsets of features are needed to solve different problems.…”
Section: Current Approaches For Pattern Recognition On the Graph Domain
Confidence: 99%
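The feature-engineering strategy described in this last quote can be sketched as follows: several graph-level descriptors (here, degree statistics plus the heat trace $\mathrm{Tr}\,e^{-t\hat{L}}$ at a few time scales, one of the features the quote mentions) are concatenated into a single Euclidean vector. This is a hedged illustration, not the cited authors' pipeline; the function names and the chosen time scales are assumptions.

```python
import numpy as np

def normalised_laplacian(A):
    d = A.sum(axis=1)
    D_is = np.diag(1.0 / np.sqrt(d))
    return np.eye(len(A)) - D_is @ A @ D_is

def feature_vector(A, times=(0.5, 1.0, 2.0)):
    """Concatenate degree statistics and heat-trace values into one vector."""
    d = A.sum(axis=1)
    lam = np.linalg.eigvalsh(normalised_laplacian(A))
    # Heat trace Tr[exp(-t * L_hat)] = sum_k exp(-lambda_k * t), one value per scale
    heat_trace = [float(np.sum(np.exp(-lam * t))) for t in times]
    return np.array([d.mean(), d.std()] + heat_trace)

# Example: embed a 3-node path graph into R^5
A_path = np.array([[0, 1, 0],
                   [1, 0, 1],
                   [0, 1, 0]], dtype=float)
x = feature_vector(A_path)
```

Once every graph is mapped to such a fixed-length vector, any standard classifier or clustering algorithm can be applied unchanged, which is exactly the appeal (and the feature-selection burden) the quoted passage describes.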