2018
DOI: 10.1186/s12859-018-2286-z

Dense neural networks for predicting chromatin conformation

Abstract: Background: DNA inside eukaryotic cells wraps around histones to form the 11 nm chromatin fiber, which can further fold into higher-order DNA loops that may depend on the binding of architectural factors. Predicting how the DNA will fold given a distribution of bound factors, here viewed as a type of sequence, is currently an unsolved problem, and several heterogeneous polymer models have shown that many features of the measured structure can be reproduced from simulations. However, a model that determines the opti…

Citation Types: 0 supporting, 9 mentioning, 0 contrasting

Year Published: 2019–2024



Cited by 19 publications (9 citation statements)
References: 51 publications
“…A predictor of chromatin state representative sequences. A DNN model was built with one convolution filter (forward model) to predict a 1D chromatin state sequence representation of the chromatin structure in Drosophila [50]. This model consisted of one convolution layer and a fully connected neural network and was trained to predict a $w \times w$ Hi-C contact matrix using a genomic sequence of length $3w$ and its overlapping peaks with $M$ chromatin factors.…”
Section: Deep Learning Model Interpretation In Bioinformatics (mentioning)
confidence: 99%
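
As a rough illustration of the forward model quoted above, the sketch below builds a network with one 1D convolution layer followed by a fully connected network, mapping an $M$-track binding profile over a $3w$-bin region to a $w \times w$ contact matrix. This is a minimal PyTorch sketch; the layer sizes, kernel width, hidden dimension, and all variable names are assumptions for illustration, not the published architecture or hyperparameters.

# Minimal sketch of a "forward model": one 1D convolution over M chromatin-factor
# tracks of length 3w, then a fully connected network that outputs a w x w
# Hi-C contact matrix. Sizes and names are illustrative assumptions.
import torch
import torch.nn as nn

class ForwardChromatinModel(nn.Module):
    def __init__(self, w: int = 32, m_factors: int = 50, n_filters: int = 1,
                 kernel_size: int = 9, hidden: int = 256):
        super().__init__()
        self.w = w
        # One convolution layer scanning the M chromatin-factor tracks.
        self.conv = nn.Conv1d(m_factors, n_filters, kernel_size,
                              padding=kernel_size // 2)
        # Fully connected network mapping the convolved features to w*w contacts.
        self.fc = nn.Sequential(
            nn.Flatten(),
            nn.Linear(n_filters * 3 * w, hidden),
            nn.ReLU(),
            nn.Linear(hidden, w * w),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, M, 3w) binding profile around the region of interest.
        h = torch.relu(self.conv(x))
        contacts = self.fc(h)                      # (batch, w*w)
        return contacts.view(-1, self.w, self.w)   # (batch, w, w) contact matrix

# Example: predict contacts for a random batch of binding profiles.
model = ForwardChromatinModel(w=32, m_factors=50)
profile = torch.rand(4, 50, 96)                    # 3w = 96 bins, M = 50 factors
print(model(profile).shape)                        # torch.Size([4, 32, 32])

The quoted description mentions a single convolution filter, reflected here in n_filters=1; a faithful reimplementation would follow the hyperparameters reported in [50].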
“…Neural networks of multiple layers of decision-making nodes are able to learn features in a dataset and have been used to infer gene expression, and similar techniques have been applied to reduce noise in ChIP-seq data [77]. The ability to predict gene expression [78], epigenetic features such as chromatin structure [79], and enhancer sites [80] theoretically makes deep learning also applicable to identifying peaks in Chem-seq data, given the high background noise and nonuniform read distributions. Just as sequencing data can be thought of as images, they can be treated as inputs for convolutional neural networks and be processed similarly.…”
Section: Perspectives (mentioning)
confidence: 99%
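
The last sentence of the quote, treating sequencing data like images for convolutional networks, can be made concrete with a small sketch: a 1D CNN that scores fixed-length coverage windows for the presence of a peak. Everything here (window length, layer sizes, the name peak_scorer) is an illustrative assumption, not a published Chem-seq or ChIP-seq peak caller.

# Minimal sketch, assuming read coverage has been binned into fixed-length
# windows: a small 1D convolutional network scores each window for the
# presence of a peak, analogous to how CNNs classify images.
import torch
import torch.nn as nn

peak_scorer = nn.Sequential(
    nn.Conv1d(1, 16, kernel_size=11, padding=5),   # local coverage patterns
    nn.ReLU(),
    nn.MaxPool1d(4),
    nn.Conv1d(16, 32, kernel_size=7, padding=3),   # broader enrichment shapes
    nn.ReLU(),
    nn.AdaptiveMaxPool1d(1),                       # window-level summary
    nn.Flatten(),
    nn.Linear(32, 1),                              # peak / no-peak logit
)

coverage = torch.rand(8, 1, 2000)                  # 8 windows of 2000 bins each
print(torch.sigmoid(peak_scorer(coverage)).shape)  # torch.Size([8, 1])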
“…The machine learning approaches used in these works include generalized linear models (Ibn-Salem & Andrade-Navarro, 2019), random forest (Bkhetan & Plewczynski, 2018), other ensemble models (Whalen, Truty & Pollard, 2016), and neural networks: multi-layer perceptron, dense neural networks (Zeng, Wu & Jiang, 2018; Farré et al., 2018; Li, Wong & Jiang, 2019), convolutional neural networks (Schreiber et al., 2017), generative adversarial networks (Liu, Lv & Jiang, 2019), and recurrent neural networks (Cristescu et al., 2018; Singh et al., 2019; Gan, Li & Jiang, 2019).…”
Section: Introduction (mentioning)
confidence: 99%