2012 IEEE Statistical Signal Processing Workshop (SSP)
DOI: 10.1109/ssp.2012.6319849

Kronecker graphical lasso

Abstract: We consider high-dimensional estimation of a (possibly sparse) Kronecker-decomposable covariance matrix given i.i.d. Gaussian samples. We propose a sparse covariance estimation algorithm, Kronecker Graphical Lasso (KGlasso), for the high-dimensional setting that takes advantage of structure and sparsity. Convergence and limit-point characterization of this iterative algorithm are established. Compared to standard Glasso, KGlasso has low computational complexity as the dimension of the covariance matrix increases…
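The abstract describes KGlasso as an iterative, Glasso-based estimator for a Kronecker-structured covariance. A minimal runnable sketch of that idea is given below; it assumes the common single-Kronecker-product model Σ = A ⊗ B and a flip-flop update in which each factor's graphical-lasso step is fed the conditional sample covariance built from the other factor's current precision estimate. The function name kglasso, the penalties lam_a/lam_b, and the fixed iteration count are illustrative assumptions, not the authors' reference implementation.

```python
# Sketch of a Kronecker Graphical Lasso ("flip-flop" + glasso) iteration,
# assuming Cov(vec(Z_t)) = A kron B with A (p x p) and B (q x q).
import numpy as np
from sklearn.covariance import graphical_lasso

def kglasso(Z, lam_a=0.05, lam_b=0.05, n_iter=10):
    """Z: array of shape (n, q, p), one q-by-p reshaped sample per row."""
    n, q, p = Z.shape
    Theta_B = np.eye(q)                      # current precision factor for B
    for _ in range(n_iter):
        # A-step: p x p conditional sample covariance using the current
        # B-precision, followed by a sparse (graphical lasso) precision fit.
        S_A = sum(Zt.T @ Theta_B @ Zt for Zt in Z) / (n * q)
        _, Theta_A = graphical_lasso(S_A, alpha=lam_a)
        # B-step: the symmetric update with the new A-precision.
        S_B = sum(Zt @ Theta_A @ Zt.T for Zt in Z) / (n * p)
        _, Theta_B = graphical_lasso(S_B, alpha=lam_b)
    # The overall pq x pq precision estimate is the Kronecker product
    # of the two (sparse) factor estimates.
    return np.kron(Theta_A, Theta_B)
```

Because each glasso call operates on a p × p or q × q matrix rather than the full pq × pq covariance, the per-iteration cost scales with the factor dimensions, which is the computational advantage over standard Glasso that the abstract refers to.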

Cited by 3 publications (3 citation statements)
References 10 publications

“…of unknown parameters. This result extends the recent high-dimensional results obtained in [2], [3], [43] for the single Kronecker product model (i.e., r = 1).…”
Section: A High Dimensional Operator Norm Bound For the Permuted Sam… (supporting)
confidence: 89%
“…This asymptotic MSE convergence rate of the estimated covariance to the true covariance reflects the number of degrees of freedom of the model, which is on the order of the total number r(p² + q²) of unknown parameters. This result extends the recent high-dimensional results obtained in [2], [3], [43] for the single Kronecker product model (i.e., r = 1).…”
Section: B High Dimensional MSE Convergence Rate For PRLS (supporting)
confidence: 88%
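The degrees-of-freedom count quoted in the excerpt above can be recovered by direct parameter counting. The display below is a sketch assuming a sum of r Kronecker products with symmetric p × p left factors and q × q right factors (r, p, q are the symbols used in the excerpt; the comparison with an unstructured covariance is an added illustration, not taken from the cited paper).

```latex
% Free parameters of \Sigma = \sum_{i=1}^{r} A_i \otimes B_i,
% with symmetric A_i \in \mathbb{R}^{p \times p}, B_i \in \mathbb{R}^{q \times q}:
r\left[\frac{p(p+1)}{2} + \frac{q(q+1)}{2}\right] = O\!\left(r\,(p^{2}+q^{2})\right)
\quad\text{versus}\quad
\frac{pq(pq+1)}{2} = O\!\left(p^{2}q^{2}\right)
\ \text{for an unstructured covariance.}
```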
“…Several authors have proposed and studied penalized likelihood estimators of Φ* and Δ* when J = 1 (Allen and Tibshirani, 2010; Zhang and Schneider, 2010; Tsiligkaridis et al., 2012; Leng and Tang, 2012; Zhou, 2014).…”
Section: Introduction (mentioning)
confidence: 99%