2023
DOI: 10.1109/tvcg.2021.3125576

UnProjection: Leveraging Inverse-Projections for Visual Analytics of High-Dimensional Data

Abstract: Projection techniques are often used to visualize high-dimensional data, allowing users to better understand the overall structure of multi-dimensional spaces on a 2D screen. Although many such methods exist, comparably little work has been done on generalizable methods of inverse-projection: the process of mapping the projected points, or more generally, the projection space, back to the original high-dimensional space. In this article we present NNInv, a deep learning technique with the ability to approximate…
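The abstract describes learning an inverse mapping from the 2D projection space back to the original feature space. As a rough sketch of that general idea (not the authors' NNInv architecture), one can train a small multilayer-perceptron regressor on pairs of projected 2D coordinates and their original high-dimensional points, then query it at arbitrary 2D locations. The dataset, network size, and t-SNE settings below are illustrative assumptions.

```python
# Sketch of the inverse-projection idea (not the paper's NNInv implementation):
# learn a mapping from 2D projection coordinates back to the original nD space.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.manifold import TSNE
from sklearn.neural_network import MLPRegressor

X = load_digits().data.astype(np.float32)                   # nD points (n = 64)
Y = TSNE(n_components=2, random_state=0).fit_transform(X)   # direct projection P(X)

# Train a regressor to approximate P^-1 : R^2 -> R^n from the (Y, X) pairs.
inv = MLPRegressor(hidden_layer_sizes=(256, 256), max_iter=1000, random_state=0)
inv.fit(Y, X)

# Query an arbitrary 2D location, e.g. the midpoint between two projected points,
# and obtain an approximate high-dimensional sample for it.
query = (Y[0] + Y[1]) / 2.0
x_hat = inv.predict(query.reshape(1, -1))
print(x_hat.shape)   # (1, 64)
```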

Cited by 12 publications (15 citation statements)
References 65 publications
“…Geometrically, the projection embeds a hyperplane in input space for the linear case and a hyper-surface for the non-linear case. This embedding can be evaluated by inversely projecting 2D points to input space creating a densely sampled representation [EAS*21] with small remaining uncertainty about nD values in-between samples.…”
Section: Methods
mentioning, confidence: 99%
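The quoted passage evaluates a projection by inverse-projecting a dense grid of 2D points into data space. A minimal sketch of that idea for the linear case, where the inverse is available analytically via the PCA reconstruction, might look as follows; the dataset, grid resolution, and classifier are assumptions chosen for illustration.

```python
# Sketch: densely sample the 2D projection space and map every grid point back
# to input space (linear case, using PCA's analytic reconstruction).
import numpy as np
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)
pca = PCA(n_components=2).fit(X)
Y2 = pca.transform(X)                        # 2D embedding of the data

# Dense grid over the projection extent.
gx, gy = np.meshgrid(np.linspace(Y2[:, 0].min(), Y2[:, 0].max(), 200),
                     np.linspace(Y2[:, 1].min(), Y2[:, 1].max(), 200))
grid_2d = np.c_[gx.ravel(), gy.ravel()]

# Inverse-project every grid point onto the embedded hyperplane in input space.
grid_nd = pca.inverse_transform(grid_2d)     # shape (200*200, 4)

# Example use of the dense nD samples: color the map by a classifier's decision.
clf = LogisticRegression(max_iter=1000).fit(X, y)
label_map = clf.predict(grid_nd).reshape(gx.shape)
```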
“…We use linear projections to create dense maps that inherently do not suffer from these problems. While linear projections have been considered for this application before [CCWH08], they were dismissed due to their poorer performance in cluster separation [SGH15, ERT19] and possible data point overlap as compared to non‐linear methods [EAS*21]. We show that by providing complementary interactive selection and interpretation tools, this weakness can be alleviated.…”
Section: Related Work
mentioning, confidence: 99%
“…Their method enables the exploration of the latent space of graph layouts. More recently, NNP [EHT20, EHFT20, EAS*21] used deep learning to mimic any projection technique P by training on P(D′) for a small subset D′ ⊂ D. In addition to providing fast computation, such approaches also have the ability to project out-of-sample data. Furthermore, due to the neural-network-based approach, inference computations are parameter-free.…”
Section: Related Work
mentioning, confidence: 99%
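As described in the citation above, NNP trains a network on P(D′) for a small subset D′ ⊂ D and can then project out-of-sample points quickly. A hedged sketch of that training scheme (the dataset, subset size, and network shape are assumptions, not the NNP configuration) could be:

```python
# Sketch of the NNP-style surrogate idea: learn to mimic a projection technique P
# from its output on a small subset, then project the remaining (out-of-sample) data.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.manifold import TSNE
from sklearn.neural_network import MLPRegressor

X = load_digits().data.astype(np.float32)
rng = np.random.default_rng(0)
subset = rng.choice(len(X), size=500, replace=False)        # D' ⊂ D

# Ground-truth projection P(D') computed only on the subset.
Y_subset = TSNE(n_components=2, random_state=0).fit_transform(X[subset])

# Train a network to approximate P : R^n -> R^2 from the (X[subset], Y_subset) pairs.
nnp = MLPRegressor(hidden_layer_sizes=(256, 256), max_iter=1000, random_state=0)
nnp.fit(X[subset], Y_subset)

# Fast, parametric projection of the full dataset, including out-of-sample points.
Y_all = nnp.predict(X)
print(Y_all.shape)   # (1797, 2)
```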
“…In this paper, we present HyperNP, a deep learning technique that approximates projections across hyperparameters to enable real-time exploration of a projection method’s hyperparameters. Similar to previous work on neural network projection (NNP) [EHT20, EAS*21], HyperNP approximates the projection of a single technique (like t-SNE) on a single dataset with the added benefit of out-of-sample projection. Both NNP and HyperNP can be thought of as surrogate models for projection techniques, which attempt to alleviate some of the computational burden by approximating the results.…”
Section: Introduction
mentioning, confidence: 99%
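The citation above describes HyperNP as approximating a projection across hyperparameter settings. One hedged way to sketch that general idea (with an illustrative dataset and network, not the paper's architecture or training setup) is to condition a surrogate model on the hyperparameter by appending it to the input features:

```python
# Sketch of a hyperparameter-conditioned surrogate: train one network on t-SNE
# outputs computed at several perplexities, with the perplexity as an extra input.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.manifold import TSNE
from sklearn.neural_network import MLPRegressor

X = load_digits().data.astype(np.float32)[:500]
perplexities = [5, 30, 50]

inputs, targets = [], []
for p in perplexities:
    Yp = TSNE(n_components=2, perplexity=p, random_state=0).fit_transform(X)
    inputs.append(np.column_stack([X, np.full(len(X), p)]))   # [features, perplexity]
    targets.append(Yp)

model = MLPRegressor(hidden_layer_sizes=(256, 256), max_iter=1000, random_state=0)
model.fit(np.vstack(inputs), np.vstack(targets))

# Query the surrogate at a perplexity that was never computed directly.
Y_est = model.predict(np.column_stack([X, np.full(len(X), 20)]))
print(Y_est.shape)   # (500, 2)
```

Note that separate t-SNE runs are only defined up to rotation and reflection, so a production system would align the training embeddings before fitting; the sketch omits that step for brevity.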
“…Some interactive DR methods create a bi-directional workflow where people can alter data in the high dimensional space to see the effect on the 2D location and vice versa [9,29]. Other works explore the idea of backwards (or inverse) projections that allow people to select locations in the 2D space and generate corresponding high-dimensional representations [13,21]. PEx-Image specifically targets image data, providing interactions for exploratory tasks, such as zooming into specific projection regions, displacing points to resolve overlapping and…”
Section: Interactive Dimensionality Reduction
mentioning, confidence: 99%