2022
DOI: 10.48550/arxiv.2207.09542
Preprint

Controllable Data Generation by Deep Learning: A Review

Abstract: Designing and generating new data with targeted properties is critical to applications such as molecule design, image editing and speech synthesis. Traditional hand-crafted approaches rely heavily on domain expertise and intensive human effort, yet still suffer from insufficient scientific knowledge and low throughput, limiting effective and efficient data generation. Recently, advances in deep learning have produced expressive methods that can learn the underlying represe…

Cited by 2 publications (2 citation statements)
References: 200 publications
“…By reconstructing data from low-dimensional embeddings, the nature of the majority pattern in the data is captured while outliers and noise are filtered out. Reconstruction error, which is defined as the degree of mismatch between the original data and the reconstructed data from low-dimensional embeddings, can be considered a strong indicator for describing anomalies in the dataset [23, 34–37]. Specifically, larger reconstruction errors indicate a higher probability of anomalies, as they deviate significantly from the majority patterns.…”
Section: Candidate Anomaly Subgraphs Extraction By Location-aware Gra...
Citation type: mentioning (confidence: 99%)
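The quoted statement uses reconstruction error from a low-dimensional embedding as an anomaly indicator. The sketch below illustrates that general idea only, not the cited paper's location-aware graph method: a small PyTorch autoencoder (an assumed architecture on placeholder data) is trained to reconstruct its inputs, and samples are ranked by per-sample reconstruction error, with larger errors suggesting likely anomalies.

```python
# Illustrative sketch only: reconstruction-error-based anomaly scoring.
# The autoencoder architecture, synthetic data, and training schedule are
# assumptions for this example, not the method of any cited paper.
import torch
import torch.nn as nn

class AutoEncoder(nn.Module):
    def __init__(self, dim_in: int, dim_latent: int = 8):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(dim_in, 64), nn.ReLU(),
                                     nn.Linear(64, dim_latent))
        self.decoder = nn.Sequential(nn.Linear(dim_latent, 64), nn.ReLU(),
                                     nn.Linear(64, dim_in))

    def forward(self, x):
        # Compress to a low-dimensional embedding, then reconstruct.
        return self.decoder(self.encoder(x))

def reconstruction_scores(model: nn.Module, x: torch.Tensor) -> torch.Tensor:
    """Per-sample mean squared reconstruction error; larger = more anomalous."""
    with torch.no_grad():
        x_hat = model(x)
    return ((x - x_hat) ** 2).mean(dim=1)

# Placeholder data standing in for node/feature vectors.
x = torch.randn(1000, 32)
model = AutoEncoder(dim_in=32)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for _ in range(200):                       # short illustrative training loop
    opt.zero_grad()
    loss = ((model(x) - x) ** 2).mean()    # reconstruction (MSE) objective
    loss.backward()
    opt.step()

scores = reconstruction_scores(model, x)
ranking = torch.argsort(scores, descending=True)  # most anomalous samples first
```

Because the learned embedding captures only the majority pattern, points that deviate from it reconstruct poorly and surface at the top of the ranking.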
“…Extensive efforts have been devoted to learning the underlying low-dimensional representation and the generation process of high-dimensional data through deep generative models such as variational autoencoders (VAEs) [27,35,9], generative adversarial networks (GANs) [11,12], and normalizing flows [40,5], among others [48,17,8]. In particular, enhancing the disentanglement and independence of latent dimensions has attracted the community's attention [4,43,3,34,45,23], enabling controllable generation, i.e., generating data with desired properties by interpolating latent variables [44,13,29,25,38,20,7,49]. For instance, CSVAE transfers image attributes by correlating latent variables with desired properties [28].…”
Section: Introduction
Citation type: mentioning (confidence: 99%)
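The quoted passage mentions controllable generation by interpolating latent variables of deep generative models. The minimal sketch below shows what such latent-space manipulation looks like with a VAE-style encoder/decoder; the tiny architecture, random placeholder inputs, untrained weights, and the choice of which latent dimension to sweep are all assumptions for illustration, not CSVAE or any other specific cited model.

```python
# Illustrative sketch only: controllable generation via latent interpolation.
# Architecture, data, and the swept latent dimension are arbitrary assumptions.
import torch
import torch.nn as nn

class TinyVAE(nn.Module):
    """Minimal VAE-style encoder/decoder; untrained, for illustration only."""
    def __init__(self, dim_in: int = 32, dim_latent: int = 4):
        super().__init__()
        self.enc = nn.Linear(dim_in, 2 * dim_latent)  # predicts mean and log-variance
        self.dec = nn.Linear(dim_latent, dim_in)

    def encode(self, x):
        mu, logvar = self.enc(x).chunk(2, dim=-1)
        return mu, logvar

    def decode(self, z):
        return self.dec(z)

vae = TinyVAE()
x_a, x_b = torch.randn(1, 32), torch.randn(1, 32)  # placeholder "data" samples

with torch.no_grad():
    z_a, _ = vae.encode(x_a)   # interpolate between posterior means
    z_b, _ = vae.encode(x_b)

    # Linear interpolation in latent space: after training, intermediate codes
    # decode to data blending the properties of the two endpoints.
    blends = [vae.decode((1 - a) * z_a + a * z_b)
              for a in torch.linspace(0.0, 1.0, steps=5)]

    # Alternatively, sweep a single (ideally disentangled) latent dimension
    # to vary one property while holding the others fixed.
    z = z_a.clone()
    sweep = []
    for value in torch.linspace(-2.0, 2.0, steps=5):
        z[:, 0] = value          # dimension 0 is an arbitrary illustrative choice
        sweep.append(vae.decode(z).clone())
```

With a trained and reasonably disentangled model, the interpolated codes decode to samples whose properties shift smoothly between the two endpoints, and sweeping one dimension varies a single property while keeping the others fixed.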