2020
DOI: 10.1145/3386569.3392409

Human-in-the-loop differential subspace search in high-dimensional latent space

Abstract: Generative models based on deep neural networks often have a high-dimensional latent space, sometimes reaching a few hundred dimensions or more, which typically makes them hard for a user to explore directly. We propose differential subspace search to allow efficient iterative user exploration in such a space, without relying on domain- or data-specific assumptions. We develop a general framework to extract low-dimensional subspaces based on a local differential analysis of the…
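The abstract only sketches the idea, but a "local differential analysis" of a generator is commonly realized by examining its Jacobian at the current latent code. The snippet below is a minimal, hedged sketch of that reading, not the paper's actual algorithm: the `decode` callable, the latent size, and the SVD-based choice of directions are all assumptions made for illustration.

```python
import torch

def local_subspace(decode, z, k=2):
    """Return k orthonormal latent directions along which the output of
    `decode` changes the most locally, via an SVD of its Jacobian at z.
    (Sketch only; the paper's own subspace construction may differ.)"""
    # Jacobian of the flattened output w.r.t. the latent code: shape (D_out, D_latent).
    J = torch.autograd.functional.jacobian(lambda v: decode(v).reshape(-1), z)
    # Right singular vectors give the latent directions with the largest local effect.
    _, _, Vh = torch.linalg.svd(J, full_matrices=False)
    return Vh[:k]  # (k, D_latent), rows are orthonormal

# Usage sketch (all names hypothetical):
# decoder = ...                        # e.g. the decoder of a pretrained VAE/GAN
# z0 = torch.randn(256)                # current latent code
# dirs = local_subspace(decoder, z0, k=1)
# z_new = z0 + slider_value * dirs[0]  # user-controlled 1-D move in the subspace
```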

Cited by 30 publications (21 citation statements)
References 19 publications
“…Its decoder, which generates a geometry given a particular point of the latent space, is an example of a parametric shape. Latent spaces often have more dimensions than a human designer can handle, though, so [Chiu et al 2020] and [Abdrashitov et al 2020] try to reduce these dimensions, bringing it even closer to our conception of a hyperparameter.…”
Section: Related Work (mentioning)
confidence: 99%
“…Schrum et al [30] proposed an interface consisting of interactive evolutionary search and direct manipulation of latent variables for game level design. Chiu et al [5] proposed a method to efficiently explore latent space in a human-in-the-loop fashion using a learned generative model, and validated it on the tasks of generating images, sounds, and 3D models.…”
Section: Latent Space Exploitation (mentioning)
confidence: 99%
“…This qualitatively indicates that the gradient shown in Eqs. (5), (11) was properly estimated. Fig.…”
Section: Transition of Generated Data During Training (mentioning)
confidence: 99%
“…Namely, we train the HumanGAN's generator conditioned on the desired class label, and as a result it represents the class-specific human-acceptable distribution. This will contribute to establishing a DNN-based framework to model task-oriented perception by humans [11,12].…”
(mentioning)
confidence: 99%