2014
DOI: 10.1142/s0129065714400073
Principal Polynomial Analysis

Abstract: This paper presents a new framework for manifold learning based on a sequence of principal polynomials that capture the possibly nonlinear nature of the data. The proposed Principal Polynomial Analysis (PPA) generalizes PCA by modeling the directions of maximal variance by means of curves, instead of straight lines. In contrast to previous approaches, PPA reduces to performing simple univariate regressions, which makes it computationally feasible and robust. Moreover, PPA shows a number of interesting analytica…
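The abstract's recipe (bend each straight PCA axis into a curve fitted through simple univariate regressions) is easy to sketch. The Python snippet below is a minimal illustration of a single PPA-style step, not the authors' reference implementation; the function name ppa_step, the degree parameter, and the plain least-squares polynomial fit are assumptions made for this example.

    import numpy as np

    def ppa_step(X, degree=3):
        """One illustrative Principal Polynomial Analysis step.

        Projects centered data onto the leading PCA direction, then fits a
        polynomial (a curve rather than a straight line) that predicts the
        orthogonal remainder from that one-dimensional projection. Each
        output coordinate is a simple univariate regression on the scores.
        """
        mu = X.mean(axis=0)
        Xc = X - mu                                   # center the data
        _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
        v = Vt[0]                                     # leading principal axis
        t = Xc @ v                                    # univariate scores
        E = Xc - np.outer(t, v)                       # remainder PCA leaves behind
        V_poly = np.vander(t, degree + 1)             # polynomial basis in t
        coeffs, *_ = np.linalg.lstsq(V_poly, E, rcond=None)
        residual = E - V_poly @ coeffs                # what the curve still misses
        return t, residual, (mu, v, coeffs)

    # Toy usage: points on a noisy parabola, which no straight axis captures.
    rng = np.random.default_rng(0)
    x = rng.uniform(-1.0, 1.0, 500)
    X = np.column_stack([x, x**2 + 0.05 * rng.standard_normal(500)])
    t, residual, params = ppa_step(X, degree=2)

A full PPA transform would subtract the fitted curve and recurse on the residual for each successive dimension; a single step is enough to show the idea.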

Help me understand this report
View preprint versions

Search citation statements

Order By: Relevance

Paper Sections

Select...
2

Citation Types

1
21
0
5

Year Published

2016
2016
2024
2024

Publication Types

Select...
4
2
1

Relationship

1
6

Authors

Journals

Cited by 27 publications (27 citation statements, published 2016–2024); references 41 publications.
“…Thus, despite the simplicity and efficiency that linearity provides, non-linear methods could provide performance improvements by better exploiting data dependencies. Recently, several non-linear generalizations of PCA have been proposed to deal with data of a non-linear nature [4,12,13]. In the adaptation of these non-linear methods to image coding, there are two major considerations: first, the transform must be invertible; and second, the computational complexity and the memory consumption should be reasonable.…”
Section: Introduction
confidence: 99%
“…In [14] we explored lossless hyperspectral image coding using curvilinear techniques based on Principal Polynomial Analysis (PPA) [13]. In that work, it was shown that PPA achieves higher energy compaction and statistical independence than PCA.…”
Section: Introduction
confidence: 99%
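The energy-compaction claim can be checked on the toy parabola from the sketch above: compare the variance that the straight PCA axis leaves in the orthogonal remainder with what survives after the fitted polynomial is subtracted. This continues the earlier hypothetical snippet and is not the cited experiment, which used hyperspectral images.

    # Energy compaction check, continuing the ppa_step example above.
    mu, v, coeffs = params
    Xc = X - mu
    E_pca = Xc - np.outer(Xc @ v, v)     # remainder after the straight PCA axis
    print(E_pca.var(axis=0).sum())       # ~0.09: the parabola's bend is still here
    print(residual.var(axis=0).sum())    # ~0.0025: only the noise survives the curve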
“…In [33], [34], the following families of nonlinear generalizations of feature extraction transforms were reviewed: (i) kernel and spectral techniques such as kernel-PCA, kernel-ICA or Local Linear Embedding (LLE) [35]–[37], (ii) neural networks and autoencoders [38]–[41], and (iii) techniques based on curvilinear features [10], [31], [33], [34], [42]–[44]. In the adaptation of these feature extraction ideas to image coding, there are two major considerations: first, the transform must be invertible; and second, the computational complexity and the memory consumption should be reasonable.…”
confidence: 99%
“…In [46] we explored lossless hyperspectral image coding using curvilinear techniques (family iii) based on Principal Polynomial Analysis (PPA) [33]. PPA exploits regression to remove non-linear dependencies that remain after linear feature extraction (e.g., after classical PCA).…”
confidence: 99%
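Two of the statements above single out invertibility as the first requirement for using such transforms in image coding. The forward sketch kept everything it discarded, so its inverse takes only a few lines; again, this is an assumed continuation of the illustrative ppa_step, not the published PPA code.

    def ppa_step_inverse(t, residual, params):
        """Invert the illustrative ppa_step above.

        The forward step returned the mean, the principal axis, and the
        polynomial coefficients, so reconstruction is exact: rebuild the
        polynomial prediction from the scores and undo the centering.
        """
        mu, v, coeffs = params
        V_poly = np.vander(t, coeffs.shape[0])   # same basis as the forward step
        E = residual + V_poly @ coeffs           # orthogonal remainder restored
        return np.outer(t, v) + E + mu           # back to the original data space

    X_rec = ppa_step_inverse(t, residual, params)
    print(np.allclose(X, X_rec))                 # True: the round trip is lossless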