2013
DOI: 10.1007/s10994-013-5331-1

Sparse non Gaussian component analysis by semidefinite programming

Abstract: Sparse non-Gaussian component analysis is an unsupervised linear method for extracting structure from high-dimensional data based on estimating a low-dimensional non-Gaussian data component. In this paper we discuss a new approach, assuming the reduced dimension is known a priori, to direct estimation of the projector onto the target space using semidefinite programming. The new approach avoids the estimation of the data covariance matrix and overcomes the traditional separation of element estimation of the targ…
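The abstract's core computational step, direct estimation of the projector onto the non-Gaussian subspace by semidefinite programming, can be illustrated with a minimal sketch. This is not the authors' exact algorithm: it assumes cvxpy is available, that rough direction estimates beta_j approximately spanning the target subspace have already been obtained, and that the target dimension m is known a priori; the helper name estimate_projector is hypothetical.

```python
import numpy as np
import cvxpy as cp

def estimate_projector(betas, m):
    """Hypothetical sketch: recover a rank-m projector from noisy direction
    estimates by maximizing sum_j beta_j^T P beta_j over the Fantope
    {P : 0 <= P <= I, trace(P) = m}, a convex SDP."""
    d = betas.shape[1]
    B = betas.T @ betas                        # sum of outer products beta_j beta_j^T
    P = cp.Variable((d, d), symmetric=True)
    constraints = [P >> 0, np.eye(d) - P >> 0, cp.trace(P) == m]
    cp.Problem(cp.Maximize(cp.trace(B @ P)), constraints).solve()
    # Round the relaxed solution to an exact rank-m projector using its
    # top-m eigenvectors (eigh returns eigenvalues in ascending order).
    _, V = np.linalg.eigh(P.value)
    U = V[:, -m:]
    return U @ U.T
```

Note that only the aggregated matrix B built from the direction estimates enters the objective, which reflects the abstract's point that the full data covariance matrix need not be estimated.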

Cited by 9 publications (9 citation statements); references 25 publications. Citing publications span 2016–2024.
“…Finally, we illustrate our test and present a sequential procedure that assesses the rank of a covariance operator. The problem of covariance rank estimation is addressed in several domains: functional regression [9,7], classification [41] and dimension reduction methods such as PCA, Kernel PCA and Non-Gaussian Component Analysis [3,12,13], where the dimension of the kept subspace is a crucial problem. Here is the outline of the paper.…”
Section: Introduction (mentioning)
confidence: 99%
“…In particular, [BKS+06] provide an algorithm based on Stein's characterization of the Gaussian random variable and [DJNS13] use semi-definite programming. We remark that while the NGCA problem resembles the ICA problem, the algorithms for the latter do not seem to be directly applicable to the former.…”
Section: Related Work (mentioning)
confidence: 99%
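The excerpt above refers to Stein's characterization of the Gaussian distribution. As a small numerical illustration (not the [BKS+06] algorithm itself), Stein's identity E[f'(X)] = E[X f(X)] for standard Gaussian X can be checked by Monte Carlo; the test function f(x) = tanh(x) is an arbitrary choice.

```python
import numpy as np

# Monte Carlo check of Stein's identity E[f'(X)] = E[X f(X)] for X ~ N(0, 1),
# with the arbitrary smooth test function f(x) = tanh(x).
rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)
lhs = np.mean(1.0 - np.tanh(x) ** 2)   # E[f'(X)],  f'(x) = 1 - tanh(x)^2
rhs = np.mean(x * np.tanh(x))          # E[X f(X)]
print(lhs, rhs)                        # the two estimates nearly coincide
```

Roughly speaking, vectors of the form E[X h(X)] − E[∇h(X)] vanish for Gaussian data and otherwise point into the non-Gaussian subspace, which is the kind of signal NGCA methods exploit.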
“…Instead, the loadings matrix (the matrix which performs the dimension reduction) can be calculated using a smaller, representative dataset, and the transformation then quickly calculated for the rest of the data. PCA was introduced for Gaussian-distributed data and has been extended significantly over the years to address different research problems (see for example Tipping and Bishop (1999), Collins et al. (2002), Ding and He (2004), de Leeuw (2006) and Diederichs et al. (2013), among others). The need to develop a generalised version of PCA for the exponential family of distributions (which encompasses a wide variety of models for real-world data) was recognised in Landgraf and Lee (2015), who developed Generalised Principal Component Analysis (GPCA).…”
Section: Introduction (mentioning)
confidence: 99%
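The excerpt above notes that PCA loadings can be computed on a smaller, representative dataset and then applied to the remaining data. Here is a minimal sketch of that workflow with NumPy; the dataset, subsample size, and number of retained components are hypothetical.

```python
import numpy as np

# Minimal sketch: fit PCA loadings on a representative subsample, then
# reuse them to transform the full dataset cheaply.
rng = np.random.default_rng(0)
X = rng.normal(size=(100_000, 50))                 # hypothetical full dataset
sample = X[rng.choice(len(X), size=2_000, replace=False)]

mean = sample.mean(axis=0)
_, _, Vt = np.linalg.svd(sample - mean, full_matrices=False)
loadings = Vt[:5].T                                # keep the top 5 components

X_reduced = (X - mean) @ loadings                  # fast projection of all rows
```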