2023
DOI: 10.1371/journal.pcbi.1011256

Monte Carlo samplers for efficient network inference

Abstract: Accessing information on an underlying network driving a biological process often involves interrupting the process and collecting snapshot data. When snapshot data are stochastic, the data’s structure necessitates a probabilistic description to infer underlying reaction networks. As an example, we may imagine wanting to learn gene state networks from the type of data collected in single molecule RNA fluorescence in situ hybridization (RNA-FISH). In the networks we consider, nodes represent network states, and…

Cited by 5 publications (3 citation statements)
References 79 publications
“…We found that the BI methods were the most computationally expensive but had an efficiency (number of likelihood calculations per second) comparable to MLE methods such as differential evolution (see Figure SI 19), while generating full posterior distributions rather than point estimates. On the whole, our data on what might be considered a simple model with synthetic data clearly demonstrate the technical challenges of BI, and highlight the need for careful evaluation of MCMC sampling and MLE optimization.…”
Section: Discussion
confidence: 77%
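The efficiency metric quoted above, likelihood calculations per second, can be measured for any sampler or optimizer by wrapping the likelihood in a call counter. The sketch below is a minimal illustration of that bookkeeping, not the cited authors’ benchmarking code; the Gaussian log-likelihood on synthetic data and the use of SciPy’s differential_evolution as the MLE method are stand-in assumptions.

```python
import time
import numpy as np
from scipy.optimize import differential_evolution

# Hypothetical stand-in likelihood: a Gaussian log-likelihood on synthetic
# data; the cited studies evaluate reaction-network likelihoods instead.
rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.0, size=200)

class CountedLogLik:
    """Wrap a log-likelihood so every evaluation is counted."""
    def __init__(self):
        self.calls = 0

    def __call__(self, theta):
        self.calls += 1
        mu, log_sigma = theta
        sigma = np.exp(log_sigma)
        return np.sum(-0.5 * ((data - mu) / sigma) ** 2 - np.log(sigma))

loglik = CountedLogLik()

# MLE via differential evolution (minimize the negative log-likelihood).
start = time.perf_counter()
result = differential_evolution(lambda th: -loglik(th),
                                bounds=[(-5, 5), (-3, 3)], seed=0)
elapsed = time.perf_counter() - start

print(f"MLE estimate: {result.x}")
print(f"likelihood calls: {loglik.calls}, "
      f"calls/second: {loglik.calls / elapsed:.0f}")
```

The same wrapped likelihood can be handed to an MCMC sampler, so the Bayesian and MLE approaches are scored on identical likelihood calls, which is what makes the quoted comparison meaningful.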
“…While we only discuss a few selected algorithms and their trade-offs, it is important to note that there are hybrid approaches that combine multiple methods [32]. The random walk Metropolis–Hastings (MH) algorithm [20] is a well-established, basic method for Markov chain Monte Carlo sampling. Briefly, for each iteration, a random jump from the current parameter set to a new parameter set is performed using a proposal (jump) distribution.…”
Section: Monte Carlo in Bayesian Inference
confidence: 99%
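The quoted description of random walk MH maps directly onto a few lines of code. The following is a minimal generic sketch against an arbitrary log-posterior; the symmetric Gaussian proposal, its width `step`, and the toy Gaussian target are illustrative assumptions, not choices from the cited work.

```python
import numpy as np

def metropolis_hastings(log_post, theta0, n_steps=10_000, step=0.5, seed=0):
    """Random walk Metropolis-Hastings with a symmetric Gaussian proposal.

    Each iteration proposes a random jump from the current parameter set
    and accepts it with probability min(1, posterior ratio).
    """
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    logp = log_post(theta)
    samples = np.empty((n_steps, theta.size))
    for i in range(n_steps):
        proposal = theta + step * rng.standard_normal(theta.size)
        logp_new = log_post(proposal)
        # Symmetric proposal: the Hastings correction cancels.
        if np.log(rng.uniform()) < logp_new - logp:
            theta, logp = proposal, logp_new
        samples[i] = theta
    return samples

# Illustrative target: a standard 2-D Gaussian log-density.
samples = metropolis_hastings(lambda th: -0.5 * np.sum(th**2),
                              theta0=np.zeros(2))
print(samples.mean(axis=0), samples.std(axis=0))  # ~[0, 0], ~[1, 1]
```

Because the Gaussian proposal is symmetric, the acceptance test reduces to the posterior ratio alone, which is exactly the “random jump plus accept/reject” loop the quote describes.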
“…Fixed-temperature algorithms are stand-alone sampling procedures that can be integrated into a larger multi-temperature sampler. While we only discuss a few selected algorithms and their trade-offs, it is important to note that there are hybrid approaches that combine multiple methods…”
Section: Monte Carlo in Bayesian Inference
confidence: 99%
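A common concrete instance of a fixed-temperature move embedded in a multi-temperature sampler is parallel tempering. The sketch below assumes a Gaussian random walk MH update as the fixed-temperature building block and a small hand-picked inverse-temperature ladder `betas`; it is a generic illustration of the idea, not the hybrid scheme of the cited reference.

```python
import numpy as np

def parallel_tempering(log_post, theta0, betas=(1.0, 0.5, 0.25),
                       n_steps=5_000, step=0.5, seed=0):
    """Multi-temperature sampler built from fixed-temperature MH moves.

    Chain k targets log_post scaled by inverse temperature betas[k];
    after every sweep, one randomly chosen pair of adjacent chains
    attempts a state swap.
    """
    rng = np.random.default_rng(seed)
    K = len(betas)
    thetas = [np.array(theta0, dtype=float) for _ in range(K)]
    logps = [log_post(t) for t in thetas]
    cold = np.empty((n_steps, len(thetas[0])))
    for i in range(n_steps):
        # Fixed-temperature random walk MH update for every chain.
        for k in range(K):
            prop = thetas[k] + step * rng.standard_normal(thetas[k].size)
            lp = log_post(prop)
            if np.log(rng.uniform()) < betas[k] * (lp - logps[k]):
                thetas[k], logps[k] = prop, lp
        # Swap attempt between adjacent temperatures k and k+1.
        k = rng.integers(K - 1)
        if np.log(rng.uniform()) < (betas[k] - betas[k + 1]) * (logps[k + 1] - logps[k]):
            thetas[k], thetas[k + 1] = thetas[k + 1], thetas[k]
            logps[k], logps[k + 1] = logps[k + 1], logps[k]
        cold[i] = thetas[0]  # record the beta = 1 (target) chain
    return cold

# Illustrative target: a standard 2-D Gaussian log-density.
cold_samples = parallel_tempering(lambda th: -0.5 * np.sum(th**2),
                                  theta0=np.zeros(2))
```

The hot chains (small beta) explore a flattened posterior and pass well-mixed states down to the cold chain through the swaps, which is what makes the fixed-temperature move useful as a component rather than only as a stand-alone sampler.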