2023
DOI: 10.1038/s41598-023-43046-5

Meta-learning for transformer-based prediction of potent compounds

Hengwei Chen, Jürgen Bajorath

Abstract: For many machine learning applications in drug discovery, only limited amounts of training data are available. This typically applies to compound design and activity prediction and often restricts machine learning, especially deep learning. For low-data applications, specialized learning strategies can be considered to limit required training data. Among these is meta-learning that attempts to enable learning in low-data regimes by combining outputs of different models and utilizing meta-data from these predic…
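The abstract describes meta-learning only at a high level. As an illustration of the general idea, the sketch below shows a minimal Reptile-style meta-learning loop on a toy regression model; the model, the synthetic "tasks", and all hyperparameters are assumptions made for illustration and are not taken from the paper.

```python
import copy

import torch
from torch import nn

# Minimal Reptile-style meta-learning sketch (illustrative only): each "task"
# stands in for a low-data prediction problem with a handful of labeled samples.
# The toy regression model, synthetic data, and hyperparameters are assumptions.
model = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 1))
loss_fn = nn.MSELoss()
meta_lr, inner_lr, inner_steps = 0.1, 1e-2, 5

tasks = [(torch.randn(16, 8), torch.randn(16, 1)) for _ in range(10)]

for x, y in tasks:
    # Adapt a copy of the meta-model on the task's few samples.
    task_model = copy.deepcopy(model)
    opt = torch.optim.SGD(task_model.parameters(), lr=inner_lr)
    for _ in range(inner_steps):
        opt.zero_grad()
        loss_fn(task_model(x), y).backward()
        opt.step()
    # Reptile meta-update: move the meta-parameters toward the adapted ones.
    with torch.no_grad():
        for p, q in zip(model.parameters(), task_model.parameters()):
            p += meta_lr * (q - p)
```

After meta-training across many such tasks, the shared parameters serve as an initialization that can be fine-tuned on a new low-data task with only a few gradient steps.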

Cited by 3 publications (7 citation statements); references 37 publications.
“…Different from chemical structure conversion via generative modeling, the inclusion of molecular property constraints in compound design requires the implementation of conditional transformer models. For example, such transformers were derived to predict activity cliffs [32] or individual highly potent compounds [33].…”
Section: Exemplary Chemical Language Models (mentioning)
confidence: 99%
“…Following this approach, the DeepAC model predicted activity cliffs with an accuracy at least comparable to top‐performing machine learning models and further extended such predictions through generative design of new activity cliff compounds [32]. Furthermore, compound predictions conditioned on large potency differences were generalized beyond activity cliffs by adjusting the training protocol to predict highly potent compounds from weakly potent input templates [33]. The transformer CLM was shown to successfully predict known highly potent compounds from different activity classes not encountered during training.…”
Section: Exemplary Chemical Language Models (mentioning)
confidence: 99%
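The citing statement above summarizes potency-conditioned sequence-to-sequence prediction only in prose. The sketch below is a hypothetical illustration of how such conditioning is often encoded, by prepending a potency-difference condition token to a tokenized template SMILES; the token names, binning thresholds, and character-level tokenization are assumptions, not the scheme used in the cited work.

```python
# Hypothetical encoding of a potency-conditioned transformer input: prepend a
# coarse potency-difference token to the tokenized template (weakly potent) SMILES.
# Token names, thresholds, and character-level tokenization are illustrative only.
def build_source_tokens(template_smiles: str, delta_pki: float) -> list:
    if delta_pki >= 2.0:
        condition = "<DPKI_HIGH>"
    elif delta_pki >= 1.0:
        condition = "<DPKI_MED>"
    else:
        condition = "<DPKI_LOW>"
    return [condition] + list(template_smiles)

# Example: request a strongly more potent analog of a weakly potent template.
print(build_source_tokens("CCOc1ccccc1", delta_pki=2.3))
```

The condition token gives the encoder an explicit target for the desired potency difference, so the decoder can be trained to emit analogs consistent with that constraint.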