2022
DOI: 10.1371/journal.pone.0269461

Memory augmented recurrent neural networks for de-novo drug design

Abstract: A recurrent neural network (RNN) is a machine learning model that learns the relationships between elements of an input series, in addition to inferring a relationship between the data input to the model and the target output. Memory augmentation allows the RNN to learn interrelationships between elements of the input over a protracted length of the input series. Inspired by the success of the stack-augmented RNN (StackRNN) in generating strings for various applications, we present two memory augmented RNN-based arch…
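To make the abstract's central idea concrete, here is a minimal sketch of a stack-augmented RNN cell in PyTorch, in the spirit of the StackRNN the authors cite as inspiration. The controller type (a GRU cell), the stack depth and width, and the three-action set (push, pop, no-op) are illustrative assumptions, not details taken from the paper.

```python
# A minimal sketch of a stack-augmented RNN cell in the spirit of StackRNN
# (Joulin & Mikolov, 2015). Controller type, stack sizes, and the action
# set are illustrative assumptions, not details from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F

class StackRNNCell(nn.Module):
    def __init__(self, input_size, hidden_size, stack_depth=10, stack_width=8):
        super().__init__()
        # The controller sees the input plus the current top of the stack.
        self.rnn = nn.GRUCell(input_size + stack_width, hidden_size)
        self.action = nn.Linear(hidden_size, 3)           # push / pop / no-op
        self.value = nn.Linear(hidden_size, stack_width)  # vector to push

    def forward(self, x, h, stack):
        # stack: (batch, depth, width); its top row conditions the controller.
        h = self.rnn(torch.cat([x, stack[:, 0, :]], dim=-1), h)
        a = F.softmax(self.action(h), dim=-1)             # (batch, 3)
        v = torch.tanh(self.value(h))                     # (batch, width)
        push, pop, noop = a[:, 0:1, None], a[:, 1:2, None], a[:, 2:3, None]
        pushed = torch.cat([v.unsqueeze(1), stack[:, :-1, :]], dim=1)
        popped = torch.cat([stack[:, 1:, :],
                            torch.zeros_like(stack[:, :1, :])], dim=1)
        # Soft update: each cell is a convex mix of the three outcomes,
        # which keeps the whole stack differentiable.
        stack = push * pushed + pop * popped + noop * stack
        return h, stack
```

Because every stack cell after the update is a convex combination of the push, pop, and no-op outcomes, the external memory remains differentiable and can be trained end-to-end with the controller.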

Cited by 8 publications (5 citation statements) · References 37 publications
“…On their own, RNNs show a deficiency: their limited memory makes it difficult to store both computed data and input data. The availability of a greater quantity of memory is therefore crucial for RNNs, as it enables a stronger understanding of the relationships between input elements, as stressed by Suresh et al [39]. Furthermore, merging a TM with an RNN yields a new memory-augmented architecture, since the RNN then owns the equivalent of the TM's infinite memory tape.…”
Section: 1980-2000
confidence: 99%
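The "infinite memory tape" analogy in the statement above is typically realized as an external memory matrix addressed by content, as in Neural Turing Machines. A hedged sketch of just the content-based read step follows; the memory size, key width, and sharpness parameter beta are illustrative assumptions.

```python
# A sketch of content-based addressing over an external memory, in the
# spirit of Neural Turing Machines (Graves et al., 2014). Sizes are
# illustrative assumptions, not values from any cited paper.
import torch
import torch.nn.functional as F

def content_read(memory, key, beta):
    """memory: (N, M) matrix of N slots; key: (M,) query; beta: sharpness."""
    # Cosine similarity between the query key and every memory slot.
    sim = F.cosine_similarity(memory, key.unsqueeze(0), dim=-1)  # (N,)
    # Sharpened softmax turns similarities into attention weights.
    w = torch.softmax(beta * sim, dim=-1)
    # The read vector is a weighted mix of slots, so it is differentiable.
    return w @ memory

memory = torch.randn(128, 20)  # 128 slots, each 20 wide
key = torch.randn(20)
read = content_read(memory, key, beta=5.0)  # shape (20,)
```

A matching write step would blend erase and add vectors into the same slots using the same weights, which is what lets gradients train where the controller reads and writes.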
“…By selecting data structures that use memory efficiently, it is expected to reduce the energy required for memory operations such as reading, writing, and moving data. For example, using a linked list instead of an array can save memory.

| Architecture | Year | Energy efficiency | Notes |
|---|---|---|---|
| Harvard architecture [26] | 1946 | Poor | It can improve performance in certain applications, but may also require more complex programming techniques |
| High-Performance Computing [28], [29] | 1960 | Poor | Due to their high processing power, HPC typically consumes substantial amounts of energy |
| Quantum computers [30], [31] | 1979 | Medium | Radical paradigmatic change in the concept of computing, although QC presents cooling issues |
| Recurrent Neural Networks [38], [39] | 1986 | Poor | RNNs are energy expensive and require a higher quantity of memory |
| TrueNorth Architecture [47] | 2014 | N/A | Architecture closely based on the human brain's organization and ANNs |
| Evolutions of Turing machines (such as Neural Turing Machines [41], Neural Random Access Turing Machine [46]) | … | … | … |
…”
Section: 2000-2020
confidence: 99%
“…RNNs are often trained to generate a single character at a time, with each character informed by the characters that came before it in the molecular string. This allows them to be used to generate new molecular strings (Suresh et al., 2022). Other popular CLM architectures, as seen earlier in this review, are a) the VAE, constituted by an encoder that converts molecular strings to latent vectors and a decoder that converts latent vectors back to molecular strings, and b) GANs, constituted by a generator network that produces novel molecular strings and a discriminator network that aims to distinguish between the de novo designs and existing molecules.…”
Section: Generative AI Models
confidence: 99%
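The character-at-a-time generation described above reduces to an autoregressive sampling loop. A minimal sketch, assuming a trained next-character model that returns (logits, state), such as the CharLSTM sketched after the next statement, plus hypothetical <start>/<end> tokens and stoi/itos vocabulary maps:

```python
# Sample a molecular string one character at a time, feeding each sampled
# character back in as the next input. Token names and the model interface
# are assumptions for illustration.
import torch

def sample_string(model, stoi, itos, max_len=100, device="cpu"):
    model.eval()
    tokens = [stoi["<start>"]]
    state = None
    with torch.no_grad():
        for _ in range(max_len):
            x = torch.tensor([[tokens[-1]]], device=device)  # (1, 1)
            logits, state = model(x, state)                  # (1, 1, vocab)
            probs = torch.softmax(logits[0, -1], dim=-1)
            nxt = torch.multinomial(probs, 1).item()  # stochastic decoding
            if nxt == stoi["<end>"]:
                break
            tokens.append(nxt)
    return "".join(itos[t] for t in tokens[1:])
```

Sampling from the full softmax, rather than taking the argmax, is what gives the generator diversity across runs.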
“…We implemented a deep recurrent neural network (RNN) [29] with long short-term memory (LSTM) cells to design potential hit candidates for Mac1. RNNs have been instrumental in sequence modelling tasks, which predict or generate sequences of data (e.g., words making up sentences) where the temporal order of the data points is vital.…”
Section: AI-Guided Fragment Expansion
confidence: 99%
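A deep LSTM generator of the kind described above can be sketched as follows; the embedding size, hidden size, and layer count are assumptions, not values from the cited work.

```python
# A minimal character-level LSTM generator. Trained with cross-entropy
# against the next character, it can be sampled with a loop like the
# sample_string sketch above. Hyperparameters are illustrative.
import torch.nn as nn

class CharLSTM(nn.Module):
    def __init__(self, vocab_size, embed_dim=64, hidden_size=256, num_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_size,
                            num_layers=num_layers, batch_first=True)
        self.head = nn.Linear(hidden_size, vocab_size)

    def forward(self, x, state=None):
        # x: (batch, seq) token ids -> logits over the next token per step.
        out, state = self.lstm(self.embed(x), state)
        return self.head(out), state
```

Stacking LSTM layers gives the model capacity to track longer-range structure in the string (for example, ring openings and closings in SMILES) than a single recurrent layer typically can.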