Preprint, 2019
DOI: 10.26434/chemrxiv.8058464

A Transformer Model for Retrosynthesis

Abstract: We describe a Transformer model for a retrosynthetic reaction prediction task. The model is trained on 45,033 experimental reaction examples extracted from US patents. It can successfully predict the set of reactants for 42.7% of cases on the external test set. During the training procedure, we applied different learning rate schedules and snapshot learning. These techniques can prevent overfitting and thus can be a reason to get rid of the internal validation dataset that i…
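The abstract mentions cyclical learning rate schedules combined with snapshot learning. As a minimal sketch of how these two ideas typically fit together (cosine-annealed restarts with a model snapshot at the end of each cycle, in the style of Huang et al.'s snapshot ensembles), the code below assumes a generic PyTorch training loop; the function names, hyperparameters, and `loss_fn` signature are illustrative, not taken from the paper.

```python
import itertools
import math
import torch

def cosine_cycle_lr(step, cycle_len, lr_max=1e-3, lr_min=1e-5):
    """Cosine-annealed learning rate that restarts every `cycle_len` steps."""
    t = (step % cycle_len) / cycle_len
    return lr_min + 0.5 * (lr_max - lr_min) * (1.0 + math.cos(math.pi * t))

def train_with_snapshots(model, optimizer, loader, loss_fn,
                         n_cycles=5, cycle_len=10_000):
    """Hypothetical snapshot-learning loop: one model snapshot per LR cycle."""
    snapshots = []
    data = itertools.cycle(loader)  # stream batches across epoch boundaries
    for step in range(n_cycles * cycle_len):
        src, tgt = next(data)
        for group in optimizer.param_groups:
            group["lr"] = cosine_cycle_lr(step, cycle_len)
        optimizer.zero_grad()
        loss = loss_fn(model(src), tgt)
        loss.backward()
        optimizer.step()
        if (step + 1) % cycle_len == 0:
            # LR is at its minimum here: save a snapshot for later ensembling
            snapshots.append({k: v.detach().clone()
                              for k, v in model.state_dict().items()})
    return snapshots  # average or ensemble these weights at inference time
```

Averaging or ensembling the per-cycle snapshots at inference time is one way such a scheme can regularize training, which is consistent with the abstract's claim that the technique helps prevent overfitting.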

Cited by 19 publications (30 citation statements). References: 0 publications.
“…et al., 2017] and LSTM+Attention [Liu et al., 2017]. Note that the results of the vanilla Transformer are based on our own experiments, since the results reported by previous works [Zheng et al., 2019; Lin et al., 2019; Karpov et al., 2019; Lee et al., 2019] differ from each other. In fact, their results are close to those of our implementation.…”
Section: Settings
confidence: 82%
“…Moreover, without the constraint of fixed templates, they have the potential of discovering novel synthetic routes. The works most related to ours are [Zheng et al., 2019; Lin et al., 2019; Karpov et al., 2019; Lee et al., 2019], which apply the Transformer to retrosynthesis prediction.…”
Section: Related Work
confidence: 99%
“…The method has benefited from recent progress in the neural machine translation field, where the Transformer architecture demonstrated state-of-the-art results [34]. Recently the Transformer also exhibited very promising results in predicting the products of chemical reactions and in retrosynthesis [56,57]. One of the key features of the Transformer is its self-attention layers.…”
Section: The Transformer Applicability to Drug Generation Task
confidence: 99%
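Since this last statement singles out self-attention as the Transformer's key feature, here is a minimal sketch of single-head scaled dot-product self-attention as described by Vaswani et al. [2017]; the dimensions, weight matrices, and usage below are illustrative and not taken from any of the cited papers.

```python
import torch
import torch.nn.functional as F

def self_attention(x, w_q, w_k, w_v):
    """Single-head scaled dot-product self-attention over a token sequence.

    x: (seq_len, d_model) token embeddings (e.g., SMILES tokens);
    w_q, w_k, w_v: (d_model, d_head) learned projection matrices.
    """
    q, k, v = x @ w_q, x @ w_k, x @ w_v  # project to queries, keys, values
    scores = q @ k.transpose(-2, -1) / k.shape[-1] ** 0.5  # scaled similarities
    weights = F.softmax(scores, dim=-1)  # each token attends over all tokens
    return weights @ v                   # attention-weighted sum of values

# Illustrative usage with random weights:
d_model, d_head, seq_len = 64, 64, 10
x = torch.randn(seq_len, d_model)
w_q, w_k, w_v = (torch.randn(d_model, d_head) for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)  # shape: (10, 64)
```

Because every output position is a weighted sum over all input positions, self-attention lets the model relate distant tokens in a sequence directly, which is what makes it attractive for sequence-to-sequence tasks such as reaction and retrosynthesis prediction.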