2012 IEEE International Workshop on Machine Learning for Signal Processing
DOI: 10.1109/mlsp.2012.6349786

Nonlinear data description with Principal Polynomial Analysis

Cited by 8 publications (8 citation statements, 2014–2021) | References 16 publications
“…In this work, we propose to estimate the conditional mean at each step of the sequence using a polynomial function with coefficients w_{pij} and degree γ_p. Hence, the estimation problem becomes:

$$\mathbf{W}_p = \begin{pmatrix} w_{p21} & w_{p22} & \cdots & w_{p2(\gamma_p+1)} \\ w_{p31} & w_{p32} & \cdots & w_{p3(\gamma_p+1)} \\ \vdots & \vdots & \ddots & \vdots \end{pmatrix} \ldots$$
…”
Section: The Extension: Principal Polynomial Analysis (mentioning)
Confidence: 99%
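The estimation quoted above is a least-squares polynomial regression for the conditional mean. The following Python/NumPy sketch shows how such a coefficient matrix W_p could be fit; the names alpha (projections on the current direction), Y (the remaining d − p dimensions), and gamma_p are assumptions for illustration, not the authors' code.

```python
import numpy as np

def fit_conditional_mean(alpha, Y, gamma_p):
    """Least-squares fit of m(alpha) = W_p v(alpha), where
    v(alpha) = (1, alpha, ..., alpha**gamma_p).

    alpha : (N,) projections of the samples on the current direction.
    Y     : (N, d - p) remaining dimensions whose conditional mean we model.
    Returns W_p with shape (d - p, gamma_p + 1).
    """
    # Vandermonde feature matrix, one row per sample: shape (N, gamma_p + 1)
    V = np.vander(alpha, N=gamma_p + 1, increasing=True)
    # Solve V @ Wp.T ~= Y in the least-squares sense
    Wp_T, *_ = np.linalg.lstsq(V, Y, rcond=None)
    return Wp_T.T

def conditional_mean(Wp, alpha):
    """Evaluate the fitted polynomial conditional mean at new alpha values."""
    V = np.vander(np.atleast_1d(alpha), N=Wp.shape[1], increasing=True)
    return V @ Wp.T
```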
“…On the one hand, V_p is generated from the data. On the other hand, any method to compute an orthogonal complement from e_p is fine to obtain E_p, since the reconstruction error does not depend on the selected basis [8,9]. According to this, the number of elements in the side information corresponding to each elementary transform R_p is (γ + 1) × (d − p) (from the W_p's) plus d − p (from the e_p's).…”
Section: Inverse and Side Information (mentioning)
Confidence: 99%
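As a quick arithmetic check of the count quoted above, a minimal sketch; the function name and variables are illustrative, not from the paper.

```python
# Side-information size per elementary transform R_p, as quoted above:
# (gamma + 1) * (d - p) polynomial coefficients from W_p, plus the
# d - p components of the vector e_p.

def side_info_size(d: int, gamma: int, p: int) -> int:
    return (gamma + 1) * (d - p) + (d - p)

# Example: d = 10 dimensions, degree gamma = 3, step p = 1
# -> 4 * 9 + 9 = 45 stored elements
assert side_info_size(10, 3, 1) == 45
```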
“…In this paper, we analyze the lossless coding efficiency of an invertible nonlinear generalization of PCA, the Principal Polynomial Analysis (PPA) [8,9], originally proposed for dimensionality reduction. PPA is a deflationary algorithm based on drawing a sequence of Principal Curves that address one dimension at a time [10].…”
Section: Introduction (mentioning)
Confidence: 99%
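Taken together with the earlier quote, one deflationary step would project the data on a guiding direction, fit the polynomial conditional mean of the orthogonal residual, and subtract it before moving to the next dimension. A hedged sketch follows, assuming the leading PCA direction as the guide (the paper's actual choice of direction may differ):

```python
import numpy as np

def ppa_step(X, gamma_p):
    """One illustrative deflation step: X is (N, d) zero-mean data.
    Returns the 1-D projections and the deflated residual."""
    # Guiding direction for this step: leading right singular vector
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    e_p = Vt[0]
    alpha = X @ e_p                          # projections on e_p
    resid = X - np.outer(alpha, e_p)         # orthogonal complement of e_p
    # Remove the degree-gamma_p polynomial conditional mean of the residual
    V = np.vander(alpha, N=gamma_p + 1, increasing=True)
    W, *_ = np.linalg.lstsq(V, resid, rcond=None)
    return alpha, resid - V @ W              # one dimension handled per step
```

Storing e_p and the fitted coefficients at every step is what makes the sequence invertible, which is exactly the side-information count discussed in the quote above.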
“…Both extreme cases are undesirable for different reasons: limited performance (in too rigid methods), and complex tuning of free parameters and/or unaffordable computational cost (in too flexible methods). In this projection-onto-explicit-features context, autoencoders such as Nonlinear-PCA (NLPCA) [23], and approaches based on fitting functional curves, such as Principal Polynomial Analysis (PPA) [34,35], represent convenient intermediate points between the extreme cases in the family. Note that these methods have shown better performance than PCA on a variety of real data [35,36].…”
Section: Introduction (mentioning)
Confidence: 99%
“…In Section 4, we address two important high-dimensional problems in remote sensing: the estimation of atmospheric state vectors from Infrared Atmospheric Sounding Interferometer (IASI) hyperspectral sounding data, and the dimensionality reduction and classification of spatio-spectral Landsat image patches. In the experiments, DRR is compared with conventional PCA [26], and with recent fast nonlinear generalizations that belong to the same class of invertible transforms, PPA [34,35] and NLPCA [23]. Comparisons are made both in terms of reconstruction error and of the expressive power of the extracted features.…”
Section: Introduction (mentioning)
Confidence: 99%