2021
DOI: 10.48550/arxiv.2112.01625
Preprint

Sample-Efficient Generation of Novel Photo-acid Generator Molecules using a Deep Generative Model

Abstract: Photo-acid generators (PAGs) are compounds that release acids (H+ ions) when exposed to light. These compounds are critical components of the photolithography processes used in the manufacture of semiconductor logic and memory chips. The exponential increase in demand for semiconductors has highlighted the need to discover novel photo-acid generators. While de novo molecule design using deep generative models has been widely employed for drug discovery and material design, its application to …

Cited by 1 publication (2 citation statements)
References 24 publications
“…While both can be used as generative methods, genetic algorithms and generative models differ in the way they utilize data: genetic algorithms implicitly learn the landscape of the design space as they iteratively pursue global optimization, while generative methods learn the whole distribution of the data set. Sequence-based VAEs are the most commonly used generative method in molecular design, often using SMILES strings or similar molecular representations as input, but other deep generative models have also been proposed. VAEs have been used to generate novel environmentally friendly photoacid generators for semiconductor photolithography and metal–organic framework structures for carbon dioxide capture. Attention mechanisms, along with self-attentive transformer models, show tremendous potential in exploring complex and novel chemistries, as well as balancing learning of high-level structural and atomic relationships with latent space complexity. For example, applied molecular design could inform thermoset polymer designs with reversible cross-links, enabling reuse.…”
Section: Computational Methods for Materiomics and Sustainable Materi…
confidence: 99%
“…Sequence-based VAEs are the most commonly used 115–118 generative method in molecular design, often using SMILES strings or similar molecular representations as input, but other deep generative models have also been proposed. 119,120 VAEs have been used to generate novel environmentally friendly photoacid generators for semiconductor photolithography 116 and metal–organic framework structures for carbon dioxide capture. 118 Attention mechanisms, along with self-attentive transformer models, 121 show tremendous potential in exploring complex and novel chemistries, as well as balancing learning of high-level structural and atomic relationships with latent space complexity.…”
Section: Multiscale Machine Learning Design
confidence: 99%
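Both citation statements note that sequence-based VAEs typically consume SMILES strings. As a concrete illustration of that input representation, the sketch below tokenizes and one-hot encodes a SMILES string at the character level, the kind of tensor a VAE encoder would ingest. The tiny vocabulary and `max_len` value are assumptions chosen for illustration, not details from the cited work.

```python
# Minimal sketch: character-level one-hot encoding of a SMILES string,
# the form of input a sequence-based VAE encoder would consume.
# VOCAB is a small illustrative alphabet, not a standard SMILES vocabulary.

VOCAB = list("CNOSc1()=[]+-#")
CHAR_TO_IDX = {ch: i for i, ch in enumerate(VOCAB)}

def one_hot_smiles(smiles, max_len=20):
    """Encode a SMILES string as a max_len x len(VOCAB) 0/1 matrix,
    zero-padded past the end of the string."""
    matrix = [[0] * len(VOCAB) for _ in range(max_len)]
    for pos, ch in enumerate(smiles[:max_len]):
        matrix[pos][CHAR_TO_IDX[ch]] = 1
    return matrix

# Phenol: an aromatic ring with a hydroxyl oxygen.
encoded = one_hot_smiles("c1ccccc1O")
print(len(encoded), len(encoded[0]))  # 20 rows, one per position; 14 columns, one per vocab symbol
```

Real pipelines use richer tokenizers (multi-character atoms such as `Cl`, ring-bond digits, stereochemistry marks), but the fixed-length padded matrix above is the essential shape of the sequence representation these models learn from.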