2022
DOI: 10.1109/tpami.2021.3116668

Deep Generative Modelling: A Comparative Review of VAEs, GANs, Normalizing Flows, Energy-Based and Autoregressive Models

Abstract: Deep generative models are a class of techniques that train deep neural networks to model the distribution of training samples. Research has fragmented into various interconnected approaches, each of which makes trade-offs including run-time, diversity, and architectural restrictions. In particular, this compendium covers energy-based models, variational autoencoders, generative adversarial networks, autoregressive models, normalizing flows, in addition to numerous hybrid approaches. These techniques are compared…

Cited by 301 publications (133 citation statements)
References 70 publications
“…We identified challenges with using normalizing flows for distribution learning, namely, the need for deep networks with many bijective transformations to adequately learn a mapping to the target distribution. For this reason, it is prohibitive from a computational cost standpoint for current flow-based models to be competitive with other architectures on distribution learning tasks [14].…”
Section: Discussion
confidence: 99%
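For context on the quoted cost argument: a normalizing flow maximizes the exact log-likelihood given by the change-of-variables formula, and composing many bijections adds one Jacobian log-determinant per layer. A sketch of the standard formula (notation mine, not taken from the quoted text):

```latex
% Flow f = f_K \circ \cdots \circ f_1 maps data x to a base variable z \sim p_Z.
\log p_X(x) = \log p_Z\bigl(f(x)\bigr)
            + \sum_{k=1}^{K} \log \left| \det \frac{\partial f_k}{\partial h_{k-1}} \right|,
\qquad h_0 = x, \quad h_k = f_k(h_{k-1}).
```

Each $f_k$ must remain invertible with a tractable Jacobian determinant, which limits the expressiveness of any single layer; hence the large $K$ (deep stacks of bijections) that the authors identify as the computational bottleneck.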
“…Previous work applying NFs to molecule generation [5] has shown that NFs with post-hoc corrections to enforce chemical validity achieve high validity, novelty, and uniqueness scores on benchmark datasets like QM9 [9] and ZINC250K [10]. However, NFs require deep architectures with many bijective transformations to model complex target distributions [11,12,13], which results in a prohibitive computational cost to training flows [14]. Some applications of generative modeling may call for learning immense, heterogeneous regions of chemical space, which deep DGMs seem well-suited for.…”
Section: Introduction
confidence: 99%
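To make the depth requirement concrete, here is a minimal RealNVP-style sketch in PyTorch (my illustration, not the architecture of the cited works): each affine coupling layer transforms only half of the dimensions so the map stays bijective with a triangular Jacobian, which is why many stacked layers are needed to model a complex target distribution.

```python
import math
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    """One affine coupling layer: bijective, with a cheap log-determinant."""
    def __init__(self, dim, hidden=128):
        super().__init__()
        self.half = dim // 2
        # A small net predicts scale and shift for the second half from the first.
        self.net = nn.Sequential(
            nn.Linear(self.half, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * (dim - self.half)),
        )

    def forward(self, x):
        x1, x2 = x[:, :self.half], x[:, self.half:]
        s, t = self.net(x1).chunk(2, dim=1)
        s = torch.tanh(s)                    # keep scales well-conditioned
        z2 = x2 * torch.exp(s) + t           # invertible given x1
        log_det = s.sum(dim=1)               # Jacobian is triangular
        return torch.cat([x1, z2], dim=1), log_det

class Flow(nn.Module):
    """A stack of coupling layers; this depth is what drives the training cost."""
    def __init__(self, dim, n_layers=32):
        super().__init__()
        self.layers = nn.ModuleList([AffineCoupling(dim) for _ in range(n_layers)])

    def log_prob(self, x):
        log_det = torch.zeros(x.shape[0], device=x.device)
        for layer in self.layers:
            x, ld = layer(x)
            x = x.flip([1])                  # alternate which half is transformed
            log_det = log_det + ld
        # Standard-normal base density plus accumulated log-determinants.
        log_pz = (-0.5 * x.pow(2) - 0.5 * math.log(2 * math.pi)).sum(dim=1)
        return log_pz + log_det
```

Because each layer only warps half the coordinates, expressiveness comes almost entirely from stacking, consistent with the cost concern raised in [11-14].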
“…Although they can be powerful estimators for the distribution of interest, the training and generation can be exceedingly slow as they are done in a sequential manner (Table 2) [343]. A comprehensive benchmark over three different AR-based approaches to mAb generation, conditioned on protein structure, can be found in the work of Melnyk et al., where “causal convolutions”, GNNs, and transformer-based generation methods are compared. The authors also provide guidelines as to which of the three AR-based approaches are best suited to specific antibody design tasks (e.g., CDR3 grafting, broad or narrow sequence diversity generation).…”
Section: Unconstrained Parameter-driven In Silico Antibody Sequence S…
confidence: 99%
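The sequential bottleneck the quote describes is visible directly in the sampling loop; a generic sketch (the model interface here is hypothetical: any network returning per-position logits):

```python
import torch

@torch.no_grad()
def sample_autoregressive(model, length, bos_id=0):
    """Draw one sequence token by token.

    Every step re-runs the model conditioned on all tokens generated so far,
    so generation costs `length` forward passes and cannot be parallelized
    across positions (unlike training, which uses teacher forcing).
    """
    tokens = torch.tensor([[bos_id]])              # shape (1, 1)
    for _ in range(length):
        logits = model(tokens)                     # (1, t, vocab_size), assumed
        probs = torch.softmax(logits[:, -1, :], dim=-1)
        next_token = torch.multinomial(probs, num_samples=1)
        tokens = torch.cat([tokens, next_token], dim=1)
    return tokens[:, 1:]                           # drop the start token
```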
“…Common latent variable models in deep learning include energy-based models, variational autoencoders, and flow models [28, 29]. The VAE explicitly models the density of the distribution, so it has a prescribed Bayesian Network.…”
Section: Introduction
confidence: 99%
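On the quoted point that the VAE "explicitly models the density" with a prescribed Bayesian network: the generative model is p(x|z)p(z), trained by maximizing the evidence lower bound (ELBO). A minimal Gaussian-encoder sketch in PyTorch (dimensions and layer sizes are placeholders, not from the cited work):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class VAE(nn.Module):
    """Minimal VAE: prescribed model p(x|z)p(z), amortized posterior q(z|x)."""
    def __init__(self, x_dim=784, z_dim=16, hidden=256):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(x_dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 2 * z_dim))
        self.dec = nn.Sequential(nn.Linear(z_dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, x_dim))

    def forward(self, x):
        mu, log_var = self.enc(x).chunk(2, dim=1)
        # Reparameterization trick keeps the sampling step differentiable.
        z = mu + torch.exp(0.5 * log_var) * torch.randn_like(mu)
        return self.dec(z), mu, log_var

def negative_elbo(x, x_logits, mu, log_var):
    """Reconstruction term plus KL(q(z|x) || N(0, I)), i.e. -ELBO."""
    recon = F.binary_cross_entropy_with_logits(x_logits, x, reduction="sum")
    kl = -0.5 * torch.sum(1 + log_var - mu.pow(2) - log_var.exp())
    return recon + kl
```

The two loss terms mirror the prescribed graphical model: the reconstruction term scores p(x|z) under the decoder, and the KL term keeps the approximate posterior close to the prior p(z).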