1938
DOI: 10.1111/j.1469-1809.1938.tb02189.x

The Statistical Utilization of Multiple Measurements

Abstract: The articles published by the Annals of Eugenics (1925–1954) have been made available online as an historical archive intended for scholarly use. The work of eugenicists was often pervaded by prejudice against racial, ethnic and disabled groups. The online publication of this material for scholarly research purposes is not an endorsement of those views nor a promotion of eugenics in any way.

Cited by 720 publications (353 citation statements). References 7 publications.
“…Two classical problems with the definition of these metrics are the selection of an appropriate pdf that can estimate the true underlying density of the data, and the homoscedasticity (i.e., same variance) assumption. For instance, if every class is defined by a single multimodal Normal distribution with common covariance matrix, then the nearest-mean classifier provides the Bayes optimal classification boundary in the subspace defined by linear discriminant analysis (LDA) (33).…”
Section: Significance
confidence: 99%
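To illustrate the point made in this excerpt, the following is a minimal sketch (not taken from the cited work; the data and variable names are invented) showing that, when two classes are Gaussian with a common covariance matrix and equal priors, a nearest-class-mean rule applied in the LDA-projected subspace agrees with LDA's own decision rule. It assumes NumPy and scikit-learn are available.

```python
# Minimal sketch: two Gaussian classes sharing a covariance matrix
# (homoscedasticity). Classifying by the nearest class mean in the
# LDA-projected subspace reproduces the LDA decision rule (equal priors here).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
cov = np.array([[2.0, 0.6], [0.6, 1.0]])             # common covariance matrix
X0 = rng.multivariate_normal([0.0, 0.0], cov, 500)   # class 0
X1 = rng.multivariate_normal([2.0, 1.0], cov, 500)   # class 1
X = np.vstack([X0, X1])
y = np.r_[np.zeros(500), np.ones(500)]

lda = LinearDiscriminantAnalysis().fit(X, y)
Z = lda.transform(X)                                  # 1-D discriminant subspace

# Nearest-mean classification in the projected space
m0, m1 = Z[y == 0].mean(), Z[y == 1].mean()
nearest_mean = (np.abs(Z[:, 0] - m1) < np.abs(Z[:, 0] - m0)).astype(float)

print("agreement with LDA predictions:", (nearest_mean == lda.predict(X)).mean())
```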
“…LDA selects those basis vectors that maximize the distance between the means of each class and minimize the distance between the samples in each class and their corresponding class means [8]. This can facilitate the task of feature extraction in some applications.…”
Section: Learning Linear Subspace Representations
confidence: 99%
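A short sketch of the construction described in this excerpt, using the usual scatter-matrix formulation: the discriminant directions solve the generalized eigenproblem S_B w = λ S_W w, so that between-class scatter is maximized relative to within-class scatter. The function name and the use of SciPy's generalized eigensolver are illustrative assumptions, not details from the cited paper.

```python
# Illustrative sketch: Fisher directions maximize between-class scatter
# relative to within-class scatter, via the generalized eigenproblem
#   S_B w = lambda * S_W w.
import numpy as np
from scipy.linalg import eigh

def lda_directions(X, y, n_components):
    classes = np.unique(y)
    mu = X.mean(axis=0)
    d = X.shape[1]
    S_W = np.zeros((d, d))                  # within-class scatter
    S_B = np.zeros((d, d))                  # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mu_c = Xc.mean(axis=0)
        S_W += (Xc - mu_c).T @ (Xc - mu_c)
        diff = (mu_c - mu)[:, None]
        S_B += Xc.shape[0] * (diff @ diff.T)
    # Generalized symmetric eigenproblem; eigenvalues come back ascending,
    # so reverse the columns and keep the leading discriminant directions.
    vals, vecs = eigh(S_B, S_W)
    return vecs[:, ::-1][:, :n_components]
```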
“…A raw version of this data set can be found in our web page, http://www.imse.lsu.edu/vangelis. We analyzed these data by using Fisher's linear discriminant analysis (29)(30)(31). By using linear discriminant analysis (LDA) one can estimate the line that minimizes the misclassification probability (given that this linear combination of the features follow a normal distribution and the classes have the same variance-covariance matrix).…”
Section: Fig
confidence: 99%
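For the two-class case described in this excerpt, a hedged sketch of Fisher's rule: the discriminant direction is proportional to S_W⁻¹(μ₁ − μ₀), and with equal priors the threshold sits midway between the projected class means, which is the Bayes-optimal boundary when both classes are Gaussian with a common covariance matrix. The data below are synthetic; nothing is drawn from the data set referenced at the URL above.

```python
# Sketch of the two-class Fisher discriminant on synthetic data:
# w is proportional to S_W^{-1} (mu1 - mu0); with equal priors the
# decision threshold is the midpoint of the projected class means.
import numpy as np

def fisher_two_class(X0, X1):
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    S_W = (np.cov(X0, rowvar=False) * (len(X0) - 1)
           + np.cov(X1, rowvar=False) * (len(X1) - 1))
    w = np.linalg.solve(S_W, mu1 - mu0)               # discriminant direction
    threshold = 0.5 * (X0 @ w).mean() + 0.5 * (X1 @ w).mean()
    return w, threshold

rng = np.random.default_rng(1)
cov = np.eye(3)
A = rng.multivariate_normal([0, 0, 0], cov, 200)      # class 0
B = rng.multivariate_normal([1, 1, 0], cov, 200)      # class 1
w, t = fisher_two_class(A, B)
print("class-1 samples on the correct side:", ((B @ w) > t).mean())
```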