2001
DOI: 10.1006/jmva.2000.1920

Efficient Variable Screening for Multivariate Analysis

Abstract: It is shown how known algorithms for the comparison of all variable subsets in regression analysis can be adapted to subset comparisons in multivariate analysis, according to any index based on the Wilks, Lawley-Hotelling, or Bartlett-Pillai statistics and, in some special cases, according to any function of the sample squared canonical correlations. The issues regarding the choice of an appropriate comparison criterion are discussed. The computational effort of the proposed algorithms is studied, and it is argued…
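To make the setting concrete, here is a minimal brute-force sketch (an illustrative assumption, not the paper's Furnival-Wilson-type algorithm): it scores every size-k variable subset in a one-way MANOVA by Wilks' lambda, det(E_s)/det(E_s + H_s), and keeps the subset with the smallest value. The paper's contribution is obtaining such comparisons efficiently, without enumerating all subsets. The function and variable names below (sscp_matrices, best_k_subset, and so on) are invented for illustration.

```python
from itertools import combinations

import numpy as np


def sscp_matrices(X, groups):
    """Within-groups (E) and between-groups (H) SSCP matrices for one-way MANOVA."""
    X = np.asarray(X, dtype=float)
    groups = np.asarray(groups)
    grand_mean = X.mean(axis=0)
    p = X.shape[1]
    E = np.zeros((p, p))
    H = np.zeros((p, p))
    for g in np.unique(groups):
        Xg = X[groups == g]
        centered = Xg - Xg.mean(axis=0)          # deviations from the group mean
        E += centered.T @ centered
        d = (Xg.mean(axis=0) - grand_mean).reshape(-1, 1)
        H += Xg.shape[0] * (d @ d.T)             # group contribution to between-groups SSCP
    return E, H


def wilks_lambda(E, H, subset):
    """Wilks' lambda det(E_s) / det(E_s + H_s) restricted to a subset of variables."""
    idx = np.ix_(subset, subset)
    return np.linalg.det(E[idx]) / np.linalg.det(E[idx] + H[idx])


def best_k_subset(X, groups, k):
    """Exhaustively compare all size-k subsets; smaller Wilks' lambda = stronger separation."""
    E, H = sscp_matrices(X, groups)
    return min(combinations(range(np.asarray(X).shape[1]), k),
               key=lambda s: wilks_lambda(E, H, list(s)))
```

For example, best_k_subset(X, groups, 3) returns the three variables whose joint Wilks' lambda is smallest, i.e. the size-3 subset with the strongest group separation under this criterion.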

Cited by 33 publications (5 citation statements)
References 21 publications
“…We have followed the guidelines of Cadima and colleagues (2004) and used the GCD criterion. Given the small size of the data set, it is advisable to use the Furnival-Wilson-type algorithm (Duarte Silva 2001, 2002), which ensures that the solution produced is optimal. Note that as the number (k) of variables selected increases, so does the percentage of variance explained.…”
Section: Application of Proposed Approach to Waste Electrical and Ele… (mentioning)
confidence: 99%
“…A consolidated approach to input selection in linear regression problems is to use Branch and Bound algorithms, such as Leaps and Bounds (Furnival & Wilson, 1974) or its more recent variants (Duarte Silva, 2001, 2002). These algorithms are conceived to balance goodness-of-fit with model simplicity.…”
Section: Reconstruction Framework (mentioning)
confidence: 99%
“…A consolidated approach to input selection in linear regression problems is to use Branch and Bound algorithms, such as Leaps and Bounds (Furnival & Wilson, 1974) or its more recent variants (Duarte Silva, 2001, 2002). These algorithms are conceived to balance goodness-of-fit with model simplicity.…”
Section: Optimal Input Selection (mentioning)
confidence: 99%
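The citation statements above credit Branch and Bound (Leaps-and-Bounds-type) search with making exact subset selection tractable. The sketch below is a hedged illustration of the pruning idea only, not the Furnival-Wilson implementation nor Duarte Silva's multivariate adaptation: because adding predictors never increases the residual sum of squares, the RSS of all variables still available at a node lower-bounds the RSS of every size-k subset reachable from it, so the node can be discarded once that bound cannot beat the best subset found so far. All names are illustrative assumptions.

```python
import numpy as np


def rss(X, y, subset):
    """Residual sum of squares of the least-squares fit on the given columns."""
    beta = np.linalg.lstsq(X[:, subset], y, rcond=None)[0]
    resid = y - X[:, subset] @ beta
    return float(resid @ resid)


def best_subset_bb(X, y, k):
    """Branch and bound over predictor subsets of exact size k, minimising RSS."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    p = X.shape[1]
    best = (np.inf, None)  # (incumbent RSS, winning subset)

    def recurse(chosen, candidates):
        nonlocal best
        if len(chosen) == k:
            value = rss(X, y, chosen)
            if value < best[0]:
                best = (value, tuple(chosen))
            return
        if len(chosen) + len(candidates) < k:
            return  # not enough variables left to reach size k
        # Lower bound: RSS using every variable still on the table.
        if rss(X, y, chosen + candidates) >= best[0]:
            return  # prune: no reachable size-k subset can beat the incumbent
        head, tail = candidates[0], candidates[1:]
        recurse(chosen + [head], tail)  # branch: include head
        recurse(chosen, tail)           # branch: exclude head

    recurse([], list(range(p)))
    return best
```

Calling best_subset_bb(X, y, k) returns the minimum RSS and the indices of the winning subset; the pruning test is what distinguishes this from the brute-force enumeration sketched after the abstract.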