2020
DOI: 10.1186/s12859-020-03715-y

MCMSeq: Bayesian hierarchical modeling of clustered and repeated measures RNA sequencing experiments

Abstract: Background As the barriers to incorporating RNA sequencing (RNA-Seq) into biomedical studies continue to decrease, the complexity and size of RNA-Seq experiments are rapidly growing. Paired, longitudinal, and other correlated designs are becoming commonplace, and these studies offer immense potential for understanding how transcriptional changes within an individual over time differ depending on treatment or environmental conditions. While several methods have been proposed for dealing with repeat…


Cited by 11 publications (13 citation statements) · References 51 publications
“…The glmmTMB package utilizes the Template Model Builder automatic differentiation engine, and the INLA package uses an efficient Bayesian framework based on integrated nested Laplace approximations (LAs). We did not include methods based on adaptive Gaussian quadrature (AGQ) (glmer.nb with nAGQ>1)24,25 or Bayesian methods using Markov chain Monte Carlo (MCMC)26 because of their computational intensity. The Gaussian variational approximation method14 can be promising but has not been implemented for the NBMM.…”
Section: Results
confidence: 99%
“…Although the time complexity is under the assumption of the independent random effects, the two-layer procedure often requires hundreds of iterations to converge, which becomes the major computational bottleneck. On the other hand, Bayesian methods such as MCMC26 offer better accuracy, but are even slower than the above methods.…”
Section: Methods
confidence: 99%
“…The newer methods such as edgeR or voom do not provide theoretical guarantee of type-I error control either. These methods have been shown to have inflated type-I error rate when applied to other types of data (Hawinkel et al., 2019; Rocke et al., 2015; Vestal et al., 2020; Datta and Nettleton, 2014). In this paper, we propose a multivariate statistical framework, ‘Compositional Data Analysis using Kernels’ (CODAK), based on kernel distance covariance (KDC) (Hua and Ghosh, 2015) to quantify and test the association of predictors such as grouping or application of drugs with the composition profile of cell types.…”
Section: Statistical Challenge
confidence: 99%
“…baySeq [13, 14], PairedFB [15]), or can only perform single DF tests (e.g. MCMSeq [16], ShrinkBayes [17]).…”
Section: Introduction
confidence: 99%
“…The package rmRNAseq [18] utilizes the voom normalization method on log-transformed counts and then models the transformed data using a linear model with a continuous auto-regressive structure to account for the correlation in the data. Vestal et al. [16] tested a similar method by using a variance stabilizing transformation (VST) on raw RNA-seq counts and then fitting linear mixed models (LMMs) to the transformed data. They found that this method performed similarly to their hierarchical Bayesian MCMSeq method in terms of T1E and FDR control, but many models failed to converge in small sample size situations.…”
Section: Introduction
confidence: 99%
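The transform-then-LMM approach described in the excerpt above can be sketched in a few lines. This is a hedged illustration, not the implementation tested by Vestal et al.: it substitutes a simple log2(count + 1) transform for a true VST, uses simulated counts for a single hypothetical gene, and fits a random-intercept linear mixed model with statsmodels.

```python
# Illustrative sketch only: log2(count + 1) stands in for a real VST,
# and the data are simulated, not from any RNA-Seq experiment.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)

n_subjects, n_timepoints = 10, 3
subject = np.repeat(np.arange(n_subjects), n_timepoints)
# First half of subjects are controls, second half treated
group = (subject >= n_subjects // 2).astype(int)

# Simulated negative binomial counts with a per-subject random effect
subj_effect = rng.normal(0.0, 0.3, n_subjects)[subject]
mu = np.exp(3.0 + 0.5 * group + subj_effect)
counts = rng.negative_binomial(n=5, p=5.0 / (5.0 + mu))

df = pd.DataFrame({
    "y": np.log2(counts + 1),  # stand-in for a variance stabilizing transform
    "group": group,
    "subject": subject,
})

# Random-intercept LMM: fixed group effect, subject-level intercepts,
# which accounts for the within-subject correlation across timepoints
fit = smf.mixedlm("y ~ group", df, groups=df["subject"]).fit()
print(fit.params["group"])  # estimated treatment effect on the transformed scale
```

With few subjects, fits like this can fail to converge, which is the small-sample limitation the excerpt notes for the LMM-on-transformed-counts approach.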