2019
DOI: 10.1021/acscentsci.9b00576
Molecular Transformer: A Model for Uncertainty-Calibrated Chemical Reaction Prediction

Abstract: Organic synthesis is one of the key stumbling blocks in medicinal chemistry. A necessary yet unsolved step in planning synthesis is solving the forward problem: Given reactants and reagents, predict the products. Similar to other work, we treat reaction prediction as a machine translation problem between simplified molecular-input line-entry system (SMILES) strings (a text-based representation) of reactants, reagents, and the products. We show that a multihead attention Molecular Transformer model outperforms …

Cited by 672 publications (801 citation statements)
References 55 publications
“…Knowledge graph generation, simplification, and comparison technologies, as well as powerful graph inference methods, allow unprecedented fidelity in knowledge representation [64] and simplify how experts can consume the large volumes of knowledge extracted. Moreover, machine learning-based surrogate models for physical systems [65] achieve more than 90% accuracy in solving complex chemical problems, assisting expert chemists in designing complex synthesis pathways. The combination of these technologies, based on "bits + neurons," has led to the development of new tools that let domain experts focus more on outcomes, with the ultimate goal of speeding up groundbreaking new science and reducing the time-to-market for innovative materials.…”
Section: Integration of Bits + Neurons + Qubits (mentioning)
confidence: 99%
“…The method has benefited from recent progress in the neural machine translation field, where the Transformer architecture demonstrated state-of-the-art results [34]. Recently the Transformer has also exhibited very promising results in predicting the products of chemical reactions and in retrosynthesis [56,57]. One of the key features of the Transformer is its self-attention layers.…”
Section: The Transformer Applicability to Drug Generation Task (mentioning)
confidence: 99%
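The self-attention layers mentioned in the excerpt above can be sketched in a few lines. Below is a minimal single-head scaled dot-product attention in NumPy; the function name, weight matrices, and dimensions are illustrative assumptions for this sketch, not taken from the cited models.

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Scaled dot-product self-attention over one token sequence.

    x: (seq_len, d_model) token embeddings
    w_q, w_k, w_v: (d_model, d_k) projection matrices (illustrative)
    """
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    # Pairwise token affinities, scaled to stabilize the softmax.
    scores = q @ k.T / np.sqrt(k.shape[-1])
    # Numerically stable softmax over the key axis.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output token is a weighted mix of all value vectors.
    return weights @ v

rng = np.random.default_rng(0)
d_model, d_k, seq_len = 8, 4, 5
x = rng.standard_normal((seq_len, d_model))
w_q, w_k, w_v = (rng.standard_normal((d_model, d_k)) for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)
print(out.shape)  # (5, 4)
```

Because every token attends to every other token in one step, the model can relate distant characters of a SMILES string (e.g. an atom and its ring-closure digit) without recurrence.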
“…Concurrently with rule-based systems, a wide range of AI approaches have been reported for retrosynthetic analysis [9,12], prediction of reaction outcomes [21][22][23][24][25][26], and optimization of reaction conditions [27]. All these AI models supersede rule-based methods in their potential to mimic the human brain by learning chemistry from large data sets without human intervention.…”
Section: Introduction (mentioning)
confidence: 99%
“…Among the different AI approaches [39], those treating chemical reaction prediction as natural language (NL) problems [40] are becoming increasingly popular. They are currently the state of the art in forward reaction prediction, with a so far unbeaten accuracy of more than 90% [22]. In the NL framework, chemical reactions are encoded as sentences using reaction SMILES [41], and the forward- or retro-reaction prediction is cast as a translation problem, using different types of neural machine translation architectures.…”
Section: Introduction (mentioning)
confidence: 99%