2019
DOI: 10.1088/1742-5468/ab39d6

The scaling limit of high-dimensional online independent component analysis

Abstract: We analyze the dynamics of an online algorithm for independent component analysis in the high-dimensional scaling limit. As the ambient dimension tends to infinity, and with proper time scaling, we show that the time-varying joint empirical measure of the target feature vector and the estimates provided by the algorithm will converge weakly to a deterministic measure-valued process that can be characterized as the unique solution of a nonlinear PDE. Numerical solutions of this PDE, which involves two spatial …
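To make the abstract's statement concrete, here is a hedged sketch of the object being tracked; the notation (ambient dimension n, target feature vector ξ, algorithm estimate x_k after k steps) is my own and may not match the paper's exactly:

```latex
% Joint empirical measure of the coordinates of the target feature vector
% \xi \in \mathbb{R}^n and of the current estimate x_k \in \mathbb{R}^n,
% viewed on the accelerated time scale t = k / n:
\mu_t^{(n)} \;=\; \frac{1}{n} \sum_{i=1}^{n} \delta_{\big(\xi_i,\; x_{\lfloor nt \rfloor,\, i}\big)}
% The abstract asserts that, as n grows, \mu_t^{(n)} converges weakly to a
% deterministic measure-valued process given by the unique solution of a nonlinear PDE.
```

Under this reading, the limiting PDE describes how the joint distribution of coordinate pairs (ξ_i, x_i) evolves in the rescaled time t.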

Cited by 6 publications (5 citation statements) | References 28 publications
“…The themes of ICA and tensor decomposition have been studied in numerous statistics and signal processing literature (Bach and Jordan, 2002; Chen and Bickel, 2006; Samworth and Yuan, 2012; Bonhomme and Robin, 2009; Eriksson and Koivunen, 2004; Hallin and Mehta, 2015; Hyvärinen et al., 2001; Hyvärinen and Oja, 1997; Hyvärinen, 1999; Hyvärinen and Oja, 2000; Ilmonen and Paindaveine, 2011; Kollo, 2008; Miettinen et al., 2015; Tichavsky et al., 2006; Wang and Lu, 2017; Ge and Ma, 2017). For a treatment from a spectral learning perspective (mainly for the deterministic scenario), we refer the readers to the recently published monograph Janzamin et al. (2019) and the bibliography therein.…”
Section: More Related Literature
confidence: 99%
“…Algorithms that find local minimizers of (1.1) allow us to estimate the columns of the mixing matrix A. We analyze and discuss the following stochastic approximation method, which we refer to as online tensorial ICA (Ge et al., 2015; Wang and Lu, 2017). Initialize a unit vector u(0) appropriately, at step t = 1, 2, …”
Section: Introduction
confidence: 99%
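The quoted description of the update is cut off; below is a minimal, illustrative Python sketch of a projected stochastic-gradient step of the general kind described (one streaming sample per iteration, then renormalization to the unit sphere). The contrast function `phi`, the step size, and all names here are assumptions for illustration, not the exact update of Ge et al. (2015) or Wang and Lu (2017).

```python
import numpy as np

def online_ica_step(u, y, step_size, phi=lambda s: s ** 3):
    """One online ICA-style update (illustrative sketch, not the cited papers' exact rule).

    u         : current unit-norm estimate of a column of the mixing matrix
    y         : one freshly observed (whitened) data sample
    step_size : learning rate for this iteration
    phi       : nonlinearity applied to the projection <u, y>
                (a cubic is a common choice for kurtosis-based ICA)
    """
    # Stochastic ascent direction computed from the single sample y.
    g = phi(u @ y) * y
    # Take the step, then project back onto the unit sphere.
    u_new = u + step_size * g
    return u_new / np.linalg.norm(u_new)

# Toy usage on a synthetic n-dimensional stream.
rng = np.random.default_rng(0)
n, n_steps = 100, 5000
u = rng.standard_normal(n)
u /= np.linalg.norm(u)
for t in range(1, n_steps + 1):
    y = rng.standard_normal(n)        # placeholder for a whitened data sample
    u = online_ica_step(u, y, step_size=1.0 / n)
```

A step size on the order of 1/n loosely matches the accelerated time scale t = k/n mentioned in the abstract, which is the regime in which such scaling-limit analyses are typically carried out.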
“…For online dimension reduction tasks, more popular efforts have been focused on addressing online PCA for unsupervised learning on streaming data settings in literature (Warmuth and Kuzmin, 2008; Arora et al., 2013, 2012a; Mitliagkas et al., 2013; Feng et al., 2013), while there are also a few studies for online ICA (Li et al., 2016b; Wang and Lu, 2017). For nonlinear space learning methods, online Kernel-PCA has also received some research interests (Kuzmin and Warmuth, 2007; Honeine, 2012).…”
Section: Online Dimension Reduction
confidence: 99%
“…For SGD with constant learning rate, there has been recent progress on quantifying the dimension dependence of the sample complexity for various tasks on general (pseudo or quasi‐) convex objectives [14, 15, 24, 33, 53, 68] and special classes of non‐convex objectives [6, 31, 71]. There has also been important work on scaling limits as the dimension tends to infinity for the specific problems of linear regression [55, 76], Online PCA [41, 76], and phase retrieval [71] from random starts, and teacher‐student networks [32, 64, 65, 73] and two‐layer networks for XOR Gaussian mixtures [60] from warm starts. We also note that the study of high‐dimensional regimes of gradient descent and Langevin dynamics have a history from the statistical physics perspective, for example, in [17, 21, 22, 45, 48, 67].…”
Section: Introduction
confidence: 99%