Multidimensional Scaling, Second Edition (2000)
DOI: 10.1201/9781420036121
Cited by 936 publications (714 citation statements); references 0 publications.
“…We study two important scenarios: (i) average-case landscape of the variety of clustering algorithms over a number of real-world data sets, and (ii) a landscape over artificial data sets generated by mixtures of Gaussian components. In both cases multidimensional scaling [14] is employed to visualize the landscape. In the case of controlled artificial data sets, we also obtain a dynamic trace of the changes in the landscape caused by varying the density and isolation of clusters.…”
Section: Introduction
confidence: 99%
“…Combinations of these measures are also of interest, for example a₁₁ + a₁₂ + a₂₁ = |X ∪ Y|. Based on these four counts, a₁₁, a₂₂, a₁₂, a₂₁, one can derive a number of different similarity coefficients as shown in Table 1 (Cox and Cox, 2000).…”
Section: Similarity Measures
confidence: 99%
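The four counts in the excerpt classify the positions of two binary vectors: a₁₁ (both present), a₂₂ (both absent), a₁₂ and a₂₁ (mismatches). As a minimal sketch of how coefficients are derived from them, the code below computes two standard members of the family referenced in Table 1 of Cox and Cox (2000), the Jaccard and simple matching coefficients; the function names and sample vectors are illustrative, not from the source.

```python
# Derive similarity coefficients from the four binary match counts
# (a11, a22, a12, a21) described in the excerpt above.

def match_counts(x, y):
    """Count the four agreement/disagreement cases for binary vectors x, y."""
    a11 = sum(1 for xi, yi in zip(x, y) if xi == 1 and yi == 1)  # both present
    a22 = sum(1 for xi, yi in zip(x, y) if xi == 0 and yi == 0)  # both absent
    a12 = sum(1 for xi, yi in zip(x, y) if xi == 1 and yi == 0)  # only in x
    a21 = sum(1 for xi, yi in zip(x, y) if xi == 0 and yi == 1)  # only in y
    return a11, a22, a12, a21

def jaccard(x, y):
    # a11 + a12 + a21 = |X ∪ Y|, as noted in the excerpt; Jaccard
    # ignores joint absences (a22).
    a11, a22, a12, a21 = match_counts(x, y)
    return a11 / (a11 + a12 + a21)

def simple_matching(x, y):
    # Simple matching counts joint absences as agreement.
    a11, a22, a12, a21 = match_counts(x, y)
    return (a11 + a22) / (a11 + a22 + a12 + a21)

x = [1, 1, 0, 1, 0]
y = [1, 0, 0, 1, 1]
print(jaccard(x, y))          # 2 / (2 + 1 + 1) = 0.5
print(simple_matching(x, y))  # (2 + 1) / 5 = 0.6
```

Different coefficients in the family differ mainly in whether and how they weight the joint-absence count a₂₂.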
“…We refer the reader to the books by Borg and Groenen (1997) and Cox and Cox (2000) for a thorough treatment of the subject. Formally, let Δ = {δ_ij} be an r × r matrix representing the pair-wise dissimilarity (or distance) between r points.…”
Section: Multidimensional Scaling
confidence: 99%
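The setup described above, an r × r dissimilarity matrix Δ = {δ_ij} to be embedded in a low-dimensional space, can be sketched with classical (Torgerson) MDS, one standard solution to this problem; the helper name and the toy data below are illustrative assumptions, not taken from the cited paper.

```python
import numpy as np

def classical_mds(delta, k=2):
    """Embed an r x r dissimilarity matrix delta into k dimensions
    via classical (Torgerson) scaling."""
    r = delta.shape[0]
    J = np.eye(r) - np.ones((r, r)) / r       # centring matrix
    B = -0.5 * J @ (delta ** 2) @ J           # double-centred squared dissimilarities
    eigvals, eigvecs = np.linalg.eigh(B)
    idx = np.argsort(eigvals)[::-1][:k]       # k largest eigenvalues
    L = np.sqrt(np.clip(eigvals[idx], 0, None))
    return eigvecs[:, idx] * L                # r x k coordinate matrix

# Toy example: four points on a line, pairwise distances |i - j|.
pts = np.arange(4.0)
delta = np.abs(pts[:, None] - pts[None, :])
X = classical_mds(delta, k=1)
# The embedded coordinates reproduce the original distances
# (up to reflection and translation).
D = np.abs(X[:, 0][:, None] - X[:, 0][None, :])
print(np.allclose(D, delta))  # True
```

When Δ contains exact Euclidean distances, as in this toy case, the recovery is exact; for general dissimilarities the top-k eigenvectors give a least-squares approximation.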
“…The classical example is Principal Component Analysis (PCA) [1], but in recent years methods based on manifold learning such as Multidimensional Scaling [2], Local Linear Embedding [3], Isomap [4], Laplacian Eigenmaps [5] or Hessian Eigenmaps [6] have received a great deal of attention. The common assumption in these methods is that sample data lie on a low-dimensional manifold; their goal is to identify the metric on the underlying manifold and derive a low-dimensional representation whose natural metric adequately approximates the original manifold metric.…”
Section: Introduction
confidence: 99%