2016
DOI: 10.1371/journal.pcbi.1005234

Sparse Regression Based Structure Learning of Stochastic Reaction Networks from Single Cell Snapshot Time Series

Abstract: Stochastic chemical reaction networks constitute a model class to quantitatively describe dynamics and cell-to-cell variability in biological systems. The topology of these networks typically is only partially characterized due to experimental limitations. Current approaches for refining network topology are based on the explicit enumeration of alternative topologies and are therefore restricted to small problem instances with almost complete knowledge. We propose the reactionet lasso, a computational procedur…
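To make the abstract's core idea concrete, here is a minimal sketch of sparse-regression structure selection in the spirit of the reactionet lasso. It is not the authors' implementation; the inputs (snapshot moments and a library of candidate propensity moments) and the use of scikit-learn's Lasso are assumptions made for illustration only.

```python
# Minimal sketch of sparse-regression structure selection over candidate reactions.
# Not the published reactionet lasso code; inputs and names are hypothetical.
import numpy as np
from sklearn.linear_model import Lasso

def select_reactions(t, moments, library, alpha=0.1):
    """t: (T,) snapshot times; moments: (T, M) empirical moments from each snapshot;
    library: (T, K) candidate propensity moments, one column per hypothesized reaction.
    Returns a (K, M) coefficient matrix whose nonzero entries flag plausible reactions."""
    # Approximate the time derivative of each moment by finite differences.
    dm_dt = np.gradient(moments, t, axis=0)                    # (T, M)

    coefs = np.zeros((library.shape[1], moments.shape[1]))
    for j in range(moments.shape[1]):
        # L1-regularized least squares: d m_j / dt ≈ library @ c_j with sparse c_j,
        # so reactions whose propensities do not help explain the dynamics drop out.
        model = Lasso(alpha=alpha, fit_intercept=False, max_iter=10_000)
        model.fit(library, dm_dt[:, j])
        coefs[:, j] = model.coef_
    return coefs
```

The sparsity penalty alpha plays the role of the structure-selection knob: larger values prune more candidate reactions, at the risk of discarding true ones.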

Cited by 23 publications (27 citation statements)
References 41 publications (60 reference statements)
“…These hypotheses can be falsified using model selection criteria such as the Akaike Information Criterion (AIC) [30], or Bayesian Information Criterion (BIC) [31]. For large models, a high number of mutually non-exclusive hypotheses is not uncommon and typically leads to a combinatorial explosion of the number of model candidates (see, e.g., [32,33,34]). Computing the AIC or BIC for all model candidates for comparison would require parameter inference for each model candidate and may seem futile, given that parameter inference for a single model can already be challenging.…”
Section: Introduction
mentioning
confidence: 99%
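The combinatorial-explosion point in this statement can be made concrete: with d optional, mutually non-exclusive hypotheses there are 2^d candidate topologies, and an exhaustive AIC/BIC comparison requires one full parameter inference per candidate. The sketch below only illustrates that bookkeeping; fit_model and candidate_models are hypothetical placeholders, not part of the cited work.

```python
# Ranking candidate model topologies by information criteria.
# Each candidate needs its own fitted log-likelihood, which is what makes
# exhaustive enumeration impractical for large reaction networks.
import math

def aic(log_likelihood, n_params):
    return 2 * n_params - 2 * log_likelihood

def bic(log_likelihood, n_params, n_samples):
    return n_params * math.log(n_samples) - 2 * log_likelihood

def rank_candidates(candidate_models, data, fit_model):
    """fit_model(model, data) -> (log_likelihood, n_params): one full inference per candidate."""
    scores = []
    for model in candidate_models:      # grows combinatorially with optional reactions
        ll, k = fit_model(model, data)
        scores.append((bic(ll, k, len(data)), model))
    return sorted(scores, key=lambda s: s[0])   # smallest BIC first
```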
“…We now summarize some asymptotic properties of the function $G_\epsilon$ in (32). Given $\epsilon > 0$, recall that $G_\epsilon(x) = \epsilon \ln\!\left(1 + e^{x/\epsilon}\right)$ for all $x \in \mathbb{R}$, whose first and second derivatives are…”
mentioning
confidence: 99%
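The quoted passage breaks off before listing the derivatives; for reference, differentiating the stated definition gives the following (a plain calculus check added here, not text from the citing paper):

```latex
G_\epsilon(x) = \epsilon \ln\!\left(1 + e^{x/\epsilon}\right), \qquad
G_\epsilon'(x) = \frac{e^{x/\epsilon}}{1 + e^{x/\epsilon}}, \qquad
G_\epsilon''(x) = \frac{1}{\epsilon}\,\frac{e^{x/\epsilon}}{\left(1 + e^{x/\epsilon}\right)^{2}}.
```

In particular, $G_\epsilon(x) \to \max(x, 0)$ pointwise as $\epsilon \to 0^{+}$, so $G_\epsilon$ is a smooth approximation of the positive-part function, which is why its asymptotic behaviour is of interest.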
“…High-dimensional single-cell molecularly resolved time series data is becoming a key data source for this task [1,2]. However, these technologies are destructive, and consequently result in snapshot time series data originating from batches of cells collected at time points of interest. A longstanding and still challenging problem is to reconstruct dynamic biological processes from this data, to the end of identifying dynamically important states, i.e.…”
mentioning
confidence: 99%
“…However, by means of high-dimensional measurements, we typically observe larger systems comprising at least dozens of components with largely a priori undefined interactions. This situation results in a combinatorial explosion of model variants that cannot be exhaustively evaluated [4,5]. Alternative approaches are agnostic with regards to parametric form and model structure and use a probabilistically-motivated rule to map between distributions, e.g., one optimal transport method maps neighbours at one time point to the nearest neighbour at the next time point [6].…”
mentioning
confidence: 99%
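The nearest-neighbour mapping rule mentioned in this quote can be sketched in a few lines. This is an illustration of that rule only, not the cited optimal-transport method; the function and variable names are hypothetical.

```python
# Pair each cell measured at time t with its nearest neighbour among the cells
# measured at the next time point, in the measurement space.
import numpy as np

def nearest_neighbour_map(cells_t, cells_t_plus_1):
    """cells_t: (n, d) array; cells_t_plus_1: (m, d) array.
    Returns, for each cell at time t, the index of its closest cell at t+1."""
    # Pairwise squared Euclidean distances between the two snapshots: (n, m).
    d2 = ((cells_t[:, None, :] - cells_t_plus_1[None, :, :]) ** 2).sum(axis=-1)
    return d2.argmin(axis=1)

# Usage on toy data: snapshots of 5 and 6 cells in a 3-dimensional measurement space.
rng = np.random.default_rng(0)
matches = nearest_neighbour_map(rng.normal(size=(5, 3)), rng.normal(size=(6, 3)))
```

Because cells are destroyed at measurement, such a mapping is a proxy for unobservable single-cell trajectories rather than a true lineage assignment.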