2015
DOI: 10.1002/widm.1147
Data visualization by nonlinear dimensionality reduction

Abstract: In this overview, commonly used dimensionality reduction techniques for data visualization and their properties are reviewed. The focus is on an intuitive understanding of the underlying mathematical principles rather than on detailed algorithmic pipelines. Important mathematical properties of the techniques are summarized in tabular form. The behavior of representative techniques is demonstrated for three benchmarks, followed by a short discussion on how to quantitatively evaluate these mappings…

Cited by 65 publications (42 citation statements)
References 70 publications
“…Dimensionality reduction is a key requirement of big data visualisation (Zou et al 2016) because many of the large datasets that exist today contain multiple dimensions; however, humans can only perceive three dimensions, giving rise to the need for algorithms that can reduce datasets to two or three dimensions which can be visualised. There are a number of statistical algorithms which can achieve this, including Principal Component Analysis (PCA), t-distributed stochastic neighbour embedding (t-SNE) and diffusion maps (Agrawal et al 2015; Fernandez et al 2015; Genender-Feltheimer 2018; Gisbrecht and Hammer 2015; Shirota et al 2017). Mapping multidimensional datasets into clusters that are represented in two or three dimensions is also common and allows the partitioning of data into similar groups (Keck et al 2017).…”
Section: Dimensionality Reduction
Confidence: 99%
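The techniques named in this excerpt are available off the shelf. As a minimal sketch (assuming scikit-learn, and using its bundled 64-dimensional digits dataset purely as a stand-in for a large dataset), the following reduces the data to two visualisable dimensions with PCA and t-SNE:

```python
# Minimal sketch: project a 64-dimensional dataset to 2-D with PCA (linear)
# and t-SNE (nonlinear). The digits dataset is only a stand-in example.
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.manifold import TSNE

X, y = load_digits(return_X_y=True)            # X has shape (1797, 64)

X_pca = PCA(n_components=2).fit_transform(X)   # linear projection to 2-D
X_tsne = TSNE(n_components=2,
              random_state=0).fit_transform(X) # nonlinear embedding to 2-D

print(X_pca.shape, X_tsne.shape)               # both (1797, 2)
```

Either 2-D array can then be scatter-plotted, optionally coloured by a cluster assignment to show the grouping mentioned in the excerpt.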
“…[52,18]; dimensionality reduction plays a role as soon as data are visualized, since this requires an embedding of data in two or three dimensions. The challenge is to unravel characteristics from the data, i.e.…”
Section: Representation Learning
Confidence: 99%
“…As described in the recent overview [11], for example, one can distinguish parametric and non-parametric DR techniques.…”
Section: Dimensionality Reduction
Confidence: 99%
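The practical consequence of this distinction fits in a few lines. In the sketch below (assuming scikit-learn; the random data is purely illustrative), parametric PCA yields an explicit mapping that also projects previously unseen points, whereas non-parametric t-SNE only assigns coordinates to the points it was fitted on:

```python
# Sketch of the parametric vs. non-parametric distinction (scikit-learn).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import TSNE

rng = np.random.default_rng(0)
X_train = rng.normal(size=(100, 10))   # illustrative random data
X_new = rng.normal(size=(5, 10))       # previously unseen points

# Parametric: PCA learns an explicit linear map that applies to new data.
pca = PCA(n_components=2).fit(X_train)
print(pca.transform(X_new).shape)      # (5, 2)

# Non-parametric: t-SNE embeds only the fitted points; there is no
# transform() for out-of-sample data.
tsne = TSNE(n_components=2, random_state=0)
emb = tsne.fit_transform(X_train)      # (100, 2)
print(hasattr(tsne, "transform"))      # False
```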
“…Besides classical methods such as the linear projections offered by principal component analysis or linear discriminant analysis, and nonlinear extensions such as the self-organizing map (SOM) or generative topographic mapping (GTM), a variety of (often non-parametric) dimensionality reduction (DR) techniques has been proposed in the last decade, such as t-distributed stochastic neighbor embedding (t-SNE), neighborhood retrieval visualizer (NeRV), or maximum variance unfolding (MVU); see, e.g., the articles [39,40,24,43,16,11,24] for overviews of DR techniques. Often, however, these methods are used to visualize a given data set in two dimensions only, not yet answering the question of how to visualize the relation of these data in connection to a given classifier.…”
Section: Introduction
Confidence: 99%
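One simple way to relate a 2-D embedding to a given classifier, offered here as an illustration only (it is not the approach developed in the cited work, and the SVC classifier below is an arbitrary choice), is to colour the embedded points by the classifier's predictions:

```python
# Illustration only: juxtapose a t-SNE embedding with a classifier's
# predictions by colouring the embedded points. Not the cited paper's method.
import matplotlib.pyplot as plt
from sklearn.datasets import load_digits
from sklearn.manifold import TSNE
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)
clf = SVC().fit(X, y)                  # arbitrary classifier choice
emb = TSNE(n_components=2, random_state=0).fit_transform(X)

plt.scatter(emb[:, 0], emb[:, 1], c=clf.predict(X), s=5, cmap="tab10")
plt.title("t-SNE embedding coloured by classifier prediction")
plt.show()
```

Where the colouring disagrees with the visible cluster structure, the embedding and the classifier tell different stories, which is exactly the gap the quoted passage points to.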