2019
DOI: 10.48550/arxiv.1905.11600
Preprint

GraphNVP: An Invertible Flow Model for Generating Molecular Graphs

Abstract: We propose GraphNVP, the first invertible, normalizing flow-based molecular graph generation model. We decompose the generation of a graph into two steps: generation of (i) an adjacency tensor and (ii) node attributes. This decomposition, combined with two novel reversible flows, yields exact likelihood maximization on graph-structured data. We empirically demonstrate that our model efficiently generates valid molecular graphs with almost no duplicated molecules. In addition, we observe that the learned latent space…
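
The "exact likelihood maximization" in the abstract rests on the change-of-variables formula for invertible maps, log p(x) = log p_Z(f(x)) + log|det ∂f(x)/∂x|. Below is a minimal NumPy sketch of one affine coupling layer, the real-NVP building block the model's name refers to; the conditioner, feature dimension, and half-split masking here are illustrative placeholders, not GraphNVP's graph-specific coupling layers over the adjacency tensor and node attributes.

```python
# Toy affine coupling layer: invertible by construction, with a cheap
# log-determinant. Illustrative sketch only, not the authors' implementation.
import numpy as np

rng = np.random.default_rng(0)
D = 4                                    # toy feature dimension
W = 0.1 * rng.normal(size=(D // 2, D))   # parameters of the toy conditioner

def conditioner(x_fixed):
    # Map the untouched half to a log-scale and a shift (each of size D//2).
    h = x_fixed @ W
    return h[:, :D // 2], h[:, D // 2:]

def forward(x):
    # x -> z; the Jacobian is triangular, so log|det| = sum of log-scales.
    x1, x2 = x[:, :D // 2], x[:, D // 2:]
    log_s, t = conditioner(x1)
    z2 = x2 * np.exp(log_s) + t
    return np.concatenate([x1, z2], axis=1), log_s.sum(axis=1)

def inverse(z):
    # Exact inverse: recompute (log_s, t) from the untouched half and undo.
    z1, z2 = z[:, :D // 2], z[:, D // 2:]
    log_s, t = conditioner(z1)
    x2 = (z2 - t) * np.exp(-log_s)
    return np.concatenate([z1, x2], axis=1)

x = rng.normal(size=(3, D))
z, logdet = forward(x)
assert np.allclose(inverse(z), x)        # invertibility holds exactly
# Exact log-likelihood under a standard-normal base distribution:
log_p = -0.5 * (z ** 2 + np.log(2 * np.pi)).sum(axis=1) + logdet
```

Because the coupling map's Jacobian is triangular, its log-determinant is just the sum of the predicted log-scales, which is what makes the likelihood exact and cheap to evaluate.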

Cited by 65 publications (111 citation statements) | References 10 publications

“…Deep generative models (DGMs) learn a distribution over general molecular structures with a deep network, so that one can generate molecules by sampling from the learned distribution. Typical algorithms include variational autoencoders (VAE), generative adversarial networks (GAN), energy-based models, and flow-based models [14,21,9,37,23,17,34,38,22,49,27,5,32,1]. DGMs can also leverage Bayesian optimization in the latent space to optimize latent vectors and reconstruct them to obtain the optimized molecules [21].…”
Section: Related Work
confidence: 99%
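
The latent-space optimization route mentioned at the end of this excerpt can be made concrete with a small sketch. Everything below is a hypothetical toy: `encode`/`decode` stand in for a trained invertible model, `predict_property` for a property surrogate, and plain random hill-climbing replaces the Bayesian optimizer of [21] for brevity.

```python
# Toy latent-space optimization loop: encode, search the latent space
# against a property surrogate, decode the best latent vector.
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(8, 8))

def encode(x):               # toy invertible "encoder" (a linear map)
    return x @ A

def decode(z):               # its exact inverse, as a flow model provides
    return z @ np.linalg.inv(A)

def predict_property(z):     # toy surrogate for the property to optimize
    return -np.sum((z - 1.0) ** 2)

z = encode(rng.normal(size=8))
for _ in range(200):         # hill-climbing stands in for Bayesian optimization
    cand = z + 0.1 * rng.normal(size=8)
    if predict_property(cand) > predict_property(z):
        z = cand
optimized = decode(z)        # reconstruct from the optimized latent vector
```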
“…InfoMin employs a flow-based generative model as the view generator for data augmentation and trains the view generator in a semi-supervised manner. However, transferring this idea to graph contrastive learning is nontrivial, since current graph generative models either offer limited generation quality [19] or are designed for specific tasks such as molecular data [6,25]. To overcome this issue, in this work we build a learnable graph view generator that learns a probability distribution over node-level augmentations.…”
Section: Learnable Data Augmentation
confidence: 99%
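
A "probability distribution over node-level augmentations" can be sketched as per-node logits over a small set of actions, with a view drawn by sampling each node's action. The action set and the Gumbel-max sampler below are illustrative assumptions, not the cited method's exact design.

```python
# Toy node-level view generator: sample one augmentation action per node.
import numpy as np

rng = np.random.default_rng(2)
ACTIONS = ["keep", "drop", "mask"]       # assumed node-level augmentations
n_nodes = 5
logits = rng.normal(size=(n_nodes, len(ACTIONS)))  # learnable in practice

# Gumbel-max trick: argmax(logits + Gumbel noise) samples from softmax(logits);
# a trainable generator would use the soft (Gumbel-softmax) relaxation instead.
gumbel = -np.log(-np.log(rng.uniform(size=logits.shape)))
actions = np.argmax(logits + gumbel, axis=1)
view = [ACTIONS[a] for a in actions]     # one sampled augmented "view"
print(view)
```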
“…Most existing non-autoregressive models are permutation invariant. Various classes of models have been proposed, including VAE-based [24,3], flow-based [11,14], and score-based [16] models. They all use permutation-equivariant generation functions, as detailed in section 3.1.…”
Section: Related Work
confidence: 99%
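
To make the invariance/equivariance relationship in this excerpt concrete (in my own notation, not the excerpt's): a density over graphs with adjacency A and node features X is permutation invariant when relabeling the nodes leaves it unchanged, and permutation-equivariant generation functions are the standard way to guarantee this.

```latex
% Permutation invariance of a graph density: for every permutation matrix P,
p(P A P^{\top},\, P X) = p(A, X).
% A sufficient condition: the base density is permutation invariant and each
% generation map is permutation equivariant, i.e.
f_A(P A P^{\top},\, P X) = P\, f_A(A, X)\, P^{\top}, \qquad
f_X(P A P^{\top},\, P X) = P\, f_X(A, X),
% so the pushforward density inherits the invariance of the base.
```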