2016 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp.2016.7472899

Generalized Laplacian precision matrix estimation for graph signal processing

Cited by 97 publications (82 citation statements); references 20 publications.
“…Under this model, the sampled covariance matrix of {y_ℓ}_{ℓ=1}^L is low rank, with rank at most R. As mentioned, under such a setting it is difficult to reconstruct L from {y_ℓ}_{ℓ=1}^L using the existing methods [15][16][17][18][19][20]. Before discussing the proposed methods for inferring communities from {y_ℓ}_{ℓ=1}^L in Sections 4 and 5, let us justify the model (11), (12) with three motivating examples.…”
Section: Graph Signal Model
confidence: 99%
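The low-rank claim in the excerpt above is easy to verify numerically. Below is a minimal sketch, assuming each observed signal has the form y_ℓ = H B z_ℓ for a fixed filter H and an n × R mixing matrix B; H, B, n, R, and the sample count are illustrative assumptions, not values from the cited paper.

```python
import numpy as np

# If each observed signal is y_l = H @ B @ z_l for a fixed filter H (n x n)
# and a mixing matrix B (n x R), the sample covariance of the y_l has rank <= R.
# H, B, n, R, and the sample count are illustrative assumptions.
rng = np.random.default_rng(0)
n, R, num_samples = 6, 2, 200

H = rng.standard_normal((n, n))           # stand-in graph filter
B = rng.standard_normal((n, R))           # rank-R excitation pattern
Z = rng.standard_normal((R, num_samples))

Y = H @ B @ Z                             # observed signals, one per column
cov = (Y @ Y.T) / num_samples             # sampled covariance matrix

# cov = (H B)(Z Z^T / num_samples)(H B)^T, so its rank is at most R
print(np.linalg.matrix_rank(cov))         # prints 2 (= R)
```

Even though cov is 6 × 6, only R of its eigenvalues are non-negligible, which is the setting the excerpt says defeats full-rank precision-matrix estimators.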
“…where α ∈ (0, 1) is the speed of the diffusion process. As (I − αL)^T is a polynomial in the graph's Laplacian, we observe that y is an output of a graph filter (11). On the other hand, the excitation signal x may model the changes in temperature in the region due to a weather condition.…”
Section: Example 1: Diffusion Dynamics
confidence: 99%
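A minimal numerical sketch of this diffusion model, using a hypothetical 4-node path graph and illustrative values of α and T (none taken from the cited paper):

```python
import numpy as np

# Sketch of the diffusion model y = (I - alpha*L)^T x from the excerpt above.
# The graph (a 4-node path), alpha, and T are illustrative assumptions.

def path_laplacian(n):
    """Combinatorial Laplacian L = D - A of an n-node path graph."""
    A = np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)
    return np.diag(A.sum(axis=1)) - A

n, alpha, T = 4, 0.3, 5
L = path_laplacian(n)
H = np.linalg.matrix_power(np.eye(n) - alpha * L, T)  # graph filter (I - alpha*L)^T

x = np.zeros(n)
x[0] = 1.0            # excitation: a unit change at one node
y = H @ x             # diffused (smoothed) observation

# (I - alpha*L)^T is a degree-T polynomial in L, so H is diagonalized by
# the Laplacian's eigenvectors, with eigenvalues (1 - alpha*lambda)^T.
lam, V = np.linalg.eigh(L)
assert np.allclose(V.T @ H @ V, np.diag((1 - alpha * lam) ** T), atol=1e-8)
```

Because each row of (I − αL) sums to one, the diffusion conserves total mass: the unit excitation at node 0 is spread over the path, but y still sums to one.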
“…of identifying the graph underlying the observed data values according to given criteria, such as graph smoothness or graph sparsity [1], [2]. In fact, except for specific SoG applications where the data are uniquely associated with an underlying graph structure, e.g.…”
Section: Introduction
confidence: 99%
“…Model-based graph learning has been recently analyzed in [1], where the authors resort to a parametric smooth signal model and develop a procedure for learning the graph Laplacian matrix via an alternating minimization algorithm that jointly enforces the SoG smoothness and the Laplacian properties of sparsity and positive semidefiniteness. In [2], the authors propose an iterative Laplacian matrix learning algorithm that, building on knowledge of which edges are active, updates one row/column of the precision matrix at a time by solving a non-negative quadratic program and imposing prior constraints on the Laplacian matrix structure.…”
Section: Introduction
confidence: 99%
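The smoothness-plus-Laplacian-constraints idea that this excerpt attributes to [1] can be sketched as follows. This is a hedged illustration only: the projected-gradient solver, the objective weights, and all parameter values are assumptions for the sketch, not the cited paper's alternating-minimization algorithm or the row/column scheme of [2].

```python
import numpy as np

# Illustrative sketch: minimize the smoothness term tr(X^T L X) plus a
# Frobenius penalty over valid Laplacians (symmetric, non-positive
# off-diagonals, zero row sums), keeping tr(L) fixed to rule out L = 0.
# Solver and parameters are assumptions, not the cited paper's algorithm.

def learn_laplacian(X, beta=0.5, step=0.01, iters=500):
    """X: (n_nodes, n_signals) array of graph signals; returns an estimate of L."""
    n, m = X.shape
    C = (X @ X.T) / m                    # gradient of tr(X^T L X)/m w.r.t. L
    W = np.ones((n, n)) - np.eye(n)      # start from a fully connected graph
    for _ in range(iters):
        L = np.diag(W.sum(axis=1)) - W
        G = C + 2.0 * beta * L           # gradient of the objective w.r.t. L
        # chain rule through L = diag(W 1) - W: map the gradient to edge weights
        g = np.diag(G)
        GW = g[:, None] + g[None, :] - G - G.T
        W = np.maximum(W - step * GW, 0.0)   # project onto nonnegative weights
        np.fill_diagonal(W, 0.0)
        s = W.sum()
        if s > 0:
            W *= n / s                   # rescale so tr(L) = sum of degrees = n
    return np.diag(W.sum(axis=1)) - W
```

By construction the output is symmetric with zero row sums and non-positive off-diagonal entries, i.e. a valid combinatorial Laplacian; edges connecting nodes whose signal values differ strongly are driven to zero, which is the sparsifying effect of the smoothness criterion.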