2022
DOI: 10.1101/2022.04.26.489548
Preprint

Approximating Solutions of the Chemical Master Equation using Neural Networks

Abstract: The Chemical Master Equation (CME) provides an accurate description of stochastic biochemical reaction networks in well-mixed conditions, but it cannot be solved analytically for most systems of practical interest. While Monte Carlo methods provide a principled means to probe the system dynamics, their high computational cost can render the estimation of molecule number distributions and other numerical tasks infeasible due to the large number of repeated simulations typically required. In this paper we aim to…
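To make the cost argument in the abstract concrete, here is a minimal sketch (illustrative, not from the paper) of Monte Carlo estimation via Gillespie's direct method for a hypothetical birth-death process; the rate constants k and gamma, the end time, and the number of repetitions are arbitrary choices. Estimating the molecule-number distribution to useful accuracy requires many such independent runs, which is the cost the neural approximation aims to avoid.

```python
# Minimal sketch (illustrative, not from the paper): Gillespie's direct method
# for a hypothetical birth-death process 0 -> X (rate k), X -> 0 (rate gamma*n),
# repeated many times to estimate the molecule-number distribution at t_end.
import numpy as np

rng = np.random.default_rng(0)

def ssa_birth_death(k=10.0, gamma=0.5, t_end=20.0, n0=0):
    """Return the copy number of X at time t_end for one stochastic trajectory."""
    t, n = 0.0, n0
    while True:
        a_birth, a_death = k, gamma * n       # reaction propensities
        a_total = a_birth + a_death
        t += rng.exponential(1.0 / a_total)   # time to next reaction event
        if t >= t_end:
            return n
        n += 1 if rng.random() < a_birth / a_total else -1

# Each run fires hundreds of reactions, and thousands of runs are needed for a
# smooth histogram -- the repeated-simulation cost mentioned in the abstract.
samples = np.array([ssa_birth_death() for _ in range(5000)])
pmf = np.bincount(samples) / samples.size
print("estimated mean:", samples.mean())      # stationary mean is k/gamma = 20
```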


Cited by 4 publications (9 citation statements)
References 71 publications
“…To benchmark the algorithm’s performance, we investigated the precision achieved in recovering PMFs on a grid, as well as the runtime, for a series of testing rate vectors (Figure 2a). To quantify precision, we calculate the Hellinger distance [30], a metric of discrepancy between probability distributions: $HD(p, q) = \frac{1}{\sqrt{2}} \sqrt{\sum_{n=0}^{B_N} \sum_{m=0}^{B_M} \left(\sqrt{p(n, m)} - \sqrt{q(n, m)}\right)^2}$, where $B_N$ and $B_M$ are the state space bounds for each $\theta$. Although the system in Figure 1a does not afford a ground truth, we can approximate it by a high-quality generating function approximation.…”
Section: Results (mentioning)
confidence: 99%
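A short sketch of how such a Hellinger-distance comparison might be computed in practice; the grid bounds B_N and B_M and the placeholder arrays p and q are illustrative assumptions, and the 1/sqrt(2) factor follows the standard normalisation that bounds the distance in [0, 1].

```python
# Sketch: Hellinger distance between two PMFs truncated to the grid
# [0, B_N] x [0, B_M]; p and q stand in for an approximate and a
# reference CME solution, respectively (hypothetical placeholder data).
import numpy as np

def hellinger(p: np.ndarray, q: np.ndarray) -> float:
    """Hellinger distance between two discrete distributions on the same grid."""
    return float(np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)))

B_N, B_M = 50, 50
rng = np.random.default_rng(1)
p = rng.random((B_N + 1, B_M + 1)); p /= p.sum()   # normalise to a valid PMF
q = rng.random((B_N + 1, B_M + 1)); q /= q.sum()
print(hellinger(p, q))   # 0 for identical PMFs, 1 for disjoint supports
```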
“…To implement this approximation, we need to find values of $w_k$, …, and …. In principle, this can be accomplished by training a neural network to predict these quantities based on θ; this is the approach taken in the Nessie framework [30]. However, we find that … and ….…”
Section: Methods (mentioning)
confidence: 99%
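As an illustration of the kind of surrogate this statement describes (a stand-in sketch under stated assumptions, not the Nessie architecture or the citing paper's method), the snippet below uses a small PyTorch MLP to map a hypothetical rate vector θ to the weights and parameters of a negative-binomial mixture and evaluates the resulting PMF on a truncated grid; the layer sizes, number of components K, and parameterisation are all assumptions.

```python
# Illustrative stand-in (not Nessie's actual architecture): an MLP that maps a
# rate vector theta to the parameters of a K-component negative-binomial
# mixture, whose PMF approximates a CME marginal on a truncated grid.
import torch
import torch.nn as nn

class MixtureSurrogate(nn.Module):
    def __init__(self, theta_dim: int, K: int = 4, hidden: int = 64):
        super().__init__()
        self.K = K
        self.body = nn.Sequential(nn.Linear(theta_dim, hidden), nn.Tanh(),
                                  nn.Linear(hidden, 3 * K))

    def forward(self, theta: torch.Tensor):
        out = self.body(theta)
        w = torch.softmax(out[..., :self.K], dim=-1)            # mixture weights
        r = nn.functional.softplus(out[..., self.K:2*self.K])   # NB "size" > 0
        p = torch.sigmoid(out[..., 2*self.K:])                  # NB success prob
        return w, r, p

def mixture_pmf(w, r, p, n_max: int = 100):
    """Evaluate the negative-binomial mixture PMF on the grid 0..n_max."""
    n = torch.arange(n_max + 1, dtype=torch.float32).unsqueeze(-1)  # (n_max+1, 1)
    log_pmf = (torch.lgamma(n + r) - torch.lgamma(n + 1) - torch.lgamma(r)
               + r * torch.log(p) + n * torch.log1p(-p))
    return (log_pmf.exp() * w).sum(dim=-1)                          # (n_max+1,)

model = MixtureSurrogate(theta_dim=3)
w, r, p = model(torch.tensor([1.0, 0.5, 2.0]))   # hypothetical rate vector theta
pmf = mixture_pmf(w, r, p)
print(pmf.sum())   # close to 1 if the truncated grid captures most of the mass
```

In practice such a network would be fitted by minimising a discrepancy (for example the Hellinger distance above) between the predicted mixture and reference distributions across many training values of θ.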
“…To benchmark the algorithm’s performance, we investigated the precision achieved in recovering PMFs on a grid, as well as the runtime, for a series of testing rate vectors (Figure 2a). To quantify precision, we calculate the Hellinger distance [30], a metric of discrepancy between probability distributions:…”
Section: P(M|n; θ) and the Kernel Approximations (mentioning)
confidence: 99%
“…Most pertinently, [30] have independently proposed a method for the approximation of CME solutions by a set of negative binomial mixture functions, implemented using the neural Nessie framework. However, the approach is qualitatively different to the one we present.…”
Section: Introduction (mentioning)
confidence: 99%