2022
DOI: 10.3390/e24020141
Prediction of Time Series Gene Expression and Structural Analysis of Gene Regulatory Networks Using Recurrent Neural Networks

Abstract: Methods for time series prediction and classification of gene regulatory networks (GRNs) from gene expression data have been treated separately so far. The recent emergence of attention-based recurrent neural network (RNN) models boosted the interpretability of RNN parameters, making them appealing for the understanding of gene interactions. In this work, we generated synthetic time series gene expression data from a range of archetypal GRNs and we relied on a dual attention RNN to predict the gene temporal dy…

Cited by 20 publications (17 citation statements)
References 70 publications
“…As discussed earlier, these can be placed into two groups based on the input data. The "one-step" methods estimate dynamics by directly using expression trajectories; these include RNAForecaster [6] (which is an out-of-the-box NeuralODE), and PRESCIENT [4], among others [14,15]. PHOENIX is more similar to these methods.…”
Section: Phoenix Exceeds the Most Optimistic Performances Of Current ...
confidence: 99%
“…; x_{t_T}} without additional steps or consideration of other regulatory inputs [4,6,15]. In the process of learning transitions between consecutive time points, these "one-step" methods implicitly learn the local derivative (often referred to as "RNA velocity" [16]) dx/dt |_{x = x_{t_m}}, as an intermediary to estimating f.…”
Section: Introduction
confidence: 99%
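The "one-step" idea quoted above — learning the local derivative dx/dt between consecutive time points as an intermediary — can be illustrated with a minimal finite-difference sketch. The function name and toy trajectory below are illustrative only, not taken from PHOENIX or any of the cited methods:

```python
import numpy as np

def one_step_velocity(x, t):
    """Finite-difference estimate of dx/dt between consecutive time points.

    x : (T, G) array of expression values for G genes at T time points
    t : (T,) array of sampling times
    Returns a (T-1, G) array of local derivative ("RNA velocity") estimates.
    """
    dx = np.diff(x, axis=0)       # x_{t_{m+1}} - x_{t_m}
    dt = np.diff(t)[:, None]      # t_{m+1} - t_m, broadcast over genes
    return dx / dt

# toy trajectory: one decaying and one growing gene
t = np.linspace(0.0, 1.0, 5)
x = np.stack([np.exp(-t), np.exp(t)], axis=1)
v = one_step_velocity(x, t)
print(v.shape)  # (4, 2)
```

A model of the dynamics f can then be fit to these velocity estimates rather than to raw transitions, which is the intermediary step the excerpt describes.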
“…2) Stationary and Instant Recurrent Network: Although the windowed attention can reduce the complexity to O(L), the information utilization could be sacrificed for LTTF due to point-wise sparse connections. RNNs have achieved big successes in many sequential data applications [41]- [44] attributed to their capabilities of capturing dynamics in sequences via cycles in the network of nodes. To enhance information utilization without increasing time and memory complexities, we, therefore, renovate the recurrent network accordingly.…”
Section: A Input Representation
confidence: 99%
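The complexity claim in this excerpt — that windowed attention reduces cost to O(L) — follows because each of the L positions attends only to a fixed-size neighborhood of 2w+1 tokens instead of all L, giving O(L·w) work rather than O(L²). A minimal NumPy sketch (the function and its check are illustrative, not the cited paper's implementation):

```python
import numpy as np

def windowed_attention(q, k, v, w):
    """Self-attention where position i attends only to positions within
    distance w, costing O(L*w) instead of O(L^2) for full attention.

    q, k, v : (L, d) arrays; returns an (L, d) array.
    """
    L, d = q.shape
    out = np.empty_like(v)
    for i in range(L):
        lo, hi = max(0, i - w), min(L, i + w + 1)
        scores = q[i] @ k[lo:hi].T / np.sqrt(d)   # similarity to neighbors
        weights = np.exp(scores - scores.max())   # stable softmax
        weights /= weights.sum()
        out[i] = weights @ v[lo:hi]
    return out

# sanity check: with w >= L the window covers every position,
# so windowed attention reduces to full softmax attention
rng = np.random.default_rng(0)
q, k, v = (rng.standard_normal((6, 4)) for _ in range(3))
scores = q @ k.T / np.sqrt(4)
full = np.exp(scores - scores.max(axis=1, keepdims=True))
full = (full / full.sum(axis=1, keepdims=True)) @ v
assert np.allclose(windowed_attention(q, k, v, w=6), full)
```

The point-wise sparsity the excerpt criticizes is visible here: for small w, distant positions simply never exchange information within a single layer, which is what motivates the paper's recurrent alternative.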
“…To model the dynamics of gene expression, we used the Gillespie algorithm, 30 a well-established stochastic simulation method of choice in computational biology. 31 To ensure a comprehensive exploration of system dynamics, we implemented a parallelized approach, allowing for an in-depth examination of interactions among molecular components and their consequent impact on gene expression dynamics over a defined time horizon. Simulations were carried out with the same concentrations of trigger RNA and the DNA template used in the cell-free reactions.…”
Section: 4
confidence: 99%
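The Gillespie algorithm mentioned in this excerpt can be sketched for the simplest stochastic gene-expression model: a birth-death process with constant transcription and first-order degradation. The parameter names and the model itself are illustrative, not the cited trigger-RNA simulation:

```python
import random

def gillespie_birth_death(k_tx, k_deg, x0, t_end, seed=0):
    """Gillespie SSA for a single gene: transcription fires at constant
    rate k_tx (+1 transcript), degradation at rate k_deg * x (-1)."""
    rng = random.Random(seed)
    t, x = 0.0, x0
    times, counts = [t], [x]
    while t < t_end:
        a_tx, a_deg = k_tx, k_deg * x        # reaction propensities
        a_total = a_tx + a_deg
        if a_total == 0:
            break
        t += rng.expovariate(a_total)        # waiting time to next reaction
        if t > t_end:
            break
        if rng.random() * a_total < a_tx:    # pick reaction by propensity
            x += 1
        else:
            x -= 1
        times.append(t)
        counts.append(x)
    return times, counts

times, counts = gillespie_birth_death(k_tx=10.0, k_deg=1.0, x0=0, t_end=50.0)
```

At steady state the transcript count fluctuates around k_tx / k_deg (here 10), which is a convenient sanity check when simulating reaction networks this way.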