2017
DOI: 10.26434/chemrxiv.5309668.v3
Preprint

Optimizing distributions over molecular space. An Objective-Reinforced Generative Adversarial Network for Inverse-design Chemistry (ORGANIC)

Abstract: Molecular discovery seeks to generate chemical species tailored to very specific needs. In this paper, we present ORGANIC, a framework based on Objective-Reinforced Generative Adversarial Networks (ORGAN), capable of producing a distribution over molecular space that matches with a certain set of desirable metrics. This methodology combines two successful techniques from the machine learning community: a Generative Adversarial Network (GAN), to create non-repetitive sensible molecular species, and Reinforcement Learning […]
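As a rough illustration of the combination described in the abstract (not the authors' released implementation), the ORGAN-style idea can be pictured as a policy-gradient reward that mixes a GAN discriminator's score with a chemistry objective. In the minimal Python sketch below, the names blended_reward, discriminator_score, objective_score and the weight lam are hypothetical placeholders, assumed only for illustration.

# Illustrative sketch of an ORGAN-style blended reward and REINFORCE-type update.
# `discriminator_score` and `objective_score` are hypothetical stand-ins for a
# trained GAN discriminator and a desired chemistry metric, both scaled to [0, 1].

def blended_reward(smiles, discriminator_score, objective_score, lam=0.5):
    """Mix realism and objective: R(x) = lam * D(x) + (1 - lam) * O(x)."""
    return lam * discriminator_score(smiles) + (1.0 - lam) * objective_score(smiles)

def reinforce_loss(log_probs, rewards):
    """Policy-gradient surrogate loss for a batch of sampled molecules."""
    baseline = sum(rewards) / len(rewards)  # simple mean baseline to reduce variance
    return -sum(lp * (r - baseline) for lp, r in zip(log_probs, rewards))

Minimizing such a loss nudges the generator toward strings that both look chemically sensible to the discriminator and score well on the chosen metric, which is the biasing of the generative distribution the abstract refers to.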

Cited by 212 publications (240 citation statements) | References 12 publications
“…While originally developed for applications such as speech and image recognition, there is now an intense interest in machine learning models, which are now combined with methods from […]. [192, 196-198] However, the ultimate impact of these approaches on the development of new materials and devices has yet to be evaluated.…”
Section: Discussion (mentioning)
confidence: 99%
“…[189-193] Examples include the use of regression and classification models such as neural networks for the prediction of molecular or materials properties [194,195] and for synthesis planning, [146,149,150] as well as the use of generative models such as variational autoencoders and generative adversarial networks for inverse molecular design. [192, 196-198] However, the ultimate impact of these approaches on the development of new materials and devices has yet to be evaluated.…”
Section: Discussion (mentioning)
confidence: 99%
“…In particular, to broaden the search space, ML methods that use probabilistic language models based on deep neural networks (DNNs) have proliferated intensively since 2017. Promising examples have included various types of variational autoencoders, [12-15] generative adversarial networks, [16] recurrent neural networks, [17,18] and so on. [11] Models trained to recognize chemically realistic structures are then used to refine chemical strings in the molecular design calculation.…”
Section: Introduction (mentioning)
confidence: 99%
“…[11] Models trained to recognize chemically realistic structures are then used to refine chemical strings in the molecular design calculation. Promising examples have included various types of variational autoencoders, [12-15] generative adversarial networks, [16] recurrent neural networks, [17,18] and so on. These methods have been able to produce diverse chemical structures; however, they often require large training datasets to obtain a DNN-based generator that can produce chemically realistic molecules with grammatically valid SMILES.…”
Section: Introduction (mentioning)
confidence: 99%
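As a side note on the "grammatically valid SMILES" criterion mentioned in the statement above, validity is typically assessed by attempting to parse each generated string with a cheminformatics toolkit. The short sketch below assumes RDKit (the cited works may use other tools) and a made-up list of candidate strings, purely as a minimal example.

# Screen generated SMILES strings for parseability; invalid strings yield None.
# The candidate list is fabricated for illustration, not model output.
from rdkit import Chem

candidates = ["CCO", "c1ccccc1O", "C1=CC=CN", "not_a_smiles"]
valid = [s for s in candidates if Chem.MolFromSmiles(s) is not None]
print(f"{len(valid)}/{len(candidates)} candidates parse as valid SMILES: {valid}")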
“…Their goal is to generate new lead compounds in silico, such that their medicinal and chemical properties are predicted in advance. Examples of this approach include Variational Auto-Encoders [2], Adversarial Auto-Encoders [3,4], Recurrent Neural Networks and Reinforcement Learning [5,6,7], possibly in combination with Sequential Generative Adversarial Networks [8,9].…”
Section: Introduction (mentioning)
confidence: 99%