2013
DOI: 10.1109/msp.2013.2250591
Kernel Multivariate Analysis Framework for Supervised Subspace Learning: A Tutorial on Linear and Kernel Multivariate Methods

Abstract: Feature extraction and dimensionality reduction are important tasks in many fields of science dealing with signal processing and analysis. The relevance of these techniques is increasing as current sensory devices are developed with ever higher resolution, and problems involving multimodal data sources become more common. A plethora of feature extraction methods are available in the literature, collectively grouped under the field of Multivariate Analysis (MVA). This paper provides a uniform treatment of several…

Cited by 101 publications (97 citation statements), published 2014–2021
References 36 publications
“…The scenario calls for the concept of regularization, which is tightly related to invariance encoding and incorporation of prior knowledge and the definition of sensible cost functions. Many opportunities appear here to improve the performance of emulators: one could think of including multiple pieces of information in the regression algorithm with multimodal/multiresolution regression, e.g., by combining RTMs for the same problem, to accommodate spatial or temporal relations in the emulation [44,94], and to implement better dimensionality reduction techniques beyond linear PCA to deal with the multi-output problem [95]. Apart from these improvements in the regression algorithm, we raise here the important issue of assessment of the emulator function, e.g., by looking at the Jacobian and Hessian of the transformation [38,96], Bayesian sensitivity analysis [34,97], as well as developing emulators that may deal with coupled RTMs and transformations of coefficients [50].…”
Section: New Processing Opportunities With Emulators
Citation type: mentioning · Confidence: 99%
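The quoted passage calls for dimensionality reduction techniques beyond linear PCA [95]; kernel PCA is the canonical nonlinear extension covered by the tutorial. Below is a minimal sketch, where the RBF kernel, the gamma value, and the function name are illustrative assumptions rather than details of the citing work.

```python
import numpy as np

def kernel_pca(X, n_components=2, gamma=1.0):
    """Dimensionality reduction beyond linear PCA: project data onto
    the leading eigenvectors of the centered RBF kernel matrix."""
    # Pairwise squared Euclidean distances between samples
    sq_norms = np.sum(X**2, axis=1)
    sq_dists = sq_norms[:, None] + sq_norms[None, :] - 2.0 * X @ X.T
    K = np.exp(-gamma * sq_dists)  # RBF (Gaussian) kernel matrix

    # Center the kernel matrix, i.e., center the data in feature space
    n = K.shape[0]
    one_n = np.ones((n, n)) / n
    Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n

    # Eigendecompose; np.linalg.eigh returns eigenvalues in ascending order
    eigvals, eigvecs = np.linalg.eigh(Kc)
    top = np.argsort(eigvals)[::-1][:n_components]

    # Normalize eigenvectors so the feature-space directions have unit norm
    alphas = eigvecs[:, top] / np.sqrt(np.maximum(eigvals[top], 1e-12))

    # Scores: projections of the training samples onto the kernel components
    return Kc @ alphas
```

In a multi-output emulation setting, scores of this kind would replace the linear PCA scores as the compressed representation of the simulator outputs.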
“…A common approach in statistics to alleviate these problems considers first reducing data dimensionality and then applying the OLS normal equations to the projected data or scores [38]. These scores reduce to a linear transformation of the original data, X̃ = XU.…”
Section: Partial Least Squares Regression
Citation type: mentioning · Confidence: 99%
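A minimal sketch of this reduce-then-regress recipe: build scores X̃ = XU from a linear projection, then solve the OLS normal equations on the scores. Here U is computed with PCA purely for illustration (PLS would choose U using the targets as well), and the function name is hypothetical.

```python
import numpy as np

def project_then_ols(X, Y, n_components=5):
    """First reduce dimensionality, then apply the OLS normal
    equations to the scores X_tilde = X @ U."""
    # Center inputs and targets
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)

    # U: leading PCA loadings (right singular vectors of the centered data)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    U = Vt[:n_components].T

    # The scores are a linear transformation of the original data
    X_tilde = Xc @ U

    # OLS normal equations on the low-dimensional scores
    W = np.linalg.solve(X_tilde.T @ X_tilde, X_tilde.T @ Yc)
    return U, W  # predict: (X_new - X.mean(axis=0)) @ U @ W + Y.mean(axis=0)
```

Because X̃ᵀX̃ is small and well conditioned, the normal equations remain stable even when the original XᵀX is singular or nearly so.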
“…OPLS is a multivariate analysis method for feature extraction, which exploits the correlation between the features and the target data by combining the merits of canonical variate analysis and PLS [28,31,32]. Given a set of training samples {X,…”
Section: Orthonormalized Partial Least Squares (OPLS)
Citation type: mentioning · Confidence: 99%
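A minimal linear OPLS sketch, following the standard formulation as a generalized eigenproblem: maximize tr(Uᵀ Cxy Cyx U) subject to Uᵀ Cxx U = I on centered data. The small ridge term and the function name are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import eigh

def opls(X, Y, n_components=2):
    """Orthonormalized PLS: extract projections U that exploit the
    correlation with the targets, via (Cxy Cxy^T) u = lam Cxx u."""
    X = np.asarray(X, dtype=float)
    Y = np.asarray(Y, dtype=float).reshape(len(X), -1)  # ensure 2-D targets

    # Center inputs and targets
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)

    # Sample covariance blocks; a small ridge keeps Cxx positive definite
    Cxx = Xc.T @ Xc + 1e-8 * np.eye(X.shape[1])
    Cxy = Xc.T @ Yc

    # Generalized symmetric eigenproblem; eigh returns ascending eigenvalues
    eigvals, eigvecs = eigh(Cxy @ Cxy.T, Cxx)
    U = eigvecs[:, np.argsort(eigvals)[::-1][:n_components]]
    return U  # feature extraction: scores = Xc @ U
```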
“…OPLS is a variant of PLS that is applicable to supervised problems and satisfies certain optimality conditions with respect to PLS. Moreover, because OPLS projections are obtained specifically to predict the output labels, the extracted projection vectors are much more discriminative than those of LDA or PLS [31,32].…”
Section: Introduction
Citation type: mentioning · Confidence: 99%