2020
DOI: 10.3389/fncel.2020.00171
Convolutional Neural Networks Can Predict Retinal Differentiation in Retinal Organoids

Cited by 45 publications (41 citation statements)
References 38 publications

“…Currently, the development and differentiation process of ROs can be continuously monitored and quantitatively evaluated. Some studies have developed an algorithm based on deep learning using bright-field images, allowing researchers to predict the direction of differentiation and identify differentiation even before gene expression in the mouse-derived organoid culture (Kegeles et al., 2020a). When human-derived organoids develop into a lamellar structure, real-time imaging modalities such as fluorescence lifetime imaging microscopy, hyperspectral imaging, and optical coherence tomography can be used to monitor the metabolic status of ROs and even PRCs, and these methods will not damage the culture structure (Browne et al., 2017).…”
Section: Development of Retinal Organoids (mentioning)
confidence: 99%
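To make the kind of pipeline cited above concrete, the sketch below is a minimal, illustrative CNN classifier for bright-field organoid images, written in PyTorch. It is not the architecture from Kegeles et al., 2020a; the layer sizes, the grayscale 128x128 input, and the two-class ("differentiating" vs. "non-differentiating") output are assumptions made only for this example.

```python
# Minimal sketch (not the cited authors' exact model): a small CNN that scores
# bright-field organoid images as "differentiating" vs. "non-differentiating".
# Architecture, input size, and class labels are illustrative assumptions.
import torch
import torch.nn as nn

class BrightFieldCNN(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),   # global average pooling to a 64-d vector
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))

# Toy forward pass on a batch of grayscale 128x128 "bright-field" images.
model = BrightFieldCNN()
logits = model(torch.randn(4, 1, 128, 128))
probs = torch.softmax(logits, dim=1)   # per-image class probabilities
print(probs.shape)                     # torch.Size([4, 2])
```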
“…Summary of improved protocols of retinal organoids:
Kegeles et al., 2020a: an algorithm based on deep learning using bright-field images; predicting the direction of differentiation and identifying differentiation.
Browne et al., 2017: real-time imaging modalities including fluorescence lifetime imaging microscopy, hyperspectral imaging, and optical coherence tomography; non-invasive real-time monitoring of the metabolic status of ROs and even PRCs.
Nerve aggregates will appear first, and they can differentiate into ROs after being induced by neural stem cell medium.…”
(mentioning)
confidence: 99%
“…Phenotypic alterations that appear during the culture process could be easily discerned using computational or bioinformatic methodology, such as machine-learning methods, and integration of omics and imaging. Studies have begun to predict the differentiation efficiency of retinal organoid cultures based on bright-field images (Kegeles et al., 2020) or the functionality and quality of hPSC-derived RPE from live or immunofluorescence images (Schaub et al., 2020; Ye et al., 2020).…”
Section: Next-Generation Retinal Organoids (mentioning)
confidence: 99%
“…A study based on iPSC-derived cortical organoids grown over several months used supervised machine learning to document a similarity between the electrophysiological activity pattern of cortical organoids and human preterm neonatal electroencephalography [130]. In another study, a deep-learning algorithm based on bright-field imaging was applied to iPSC-derived retinal organoids to recognize and predict retinal differentiation [131]. In particular, the authors applied a transfer-learning approach using convolutional neural network (CNN) algorithms, which are loosely modeled on the networks of the human brain and organized into multiple functional layers [87].…”
Section: Overcoming Challenges With Possible Technical Solutions (mentioning)
confidence: 99%
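The transfer-learning approach described in the last statement can be sketched as follows: start from an ImageNet-pretrained CNN, freeze its convolutional backbone, and retrain only a small classification head on labelled bright-field images. The backbone choice (torchvision's ResNet-18), image size, labels, and optimizer settings below are illustrative assumptions, not details taken from the cited study.

```python
# Minimal transfer-learning sketch: reuse an ImageNet-pretrained CNN, freeze
# its feature layers, and train only a new 2-class head on bright-field images.
# ResNet-18 and all hyperparameters here are assumptions for illustration.
import torch
import torch.nn as nn
from torchvision import models

# Requires torchvision >= 0.13 for the weights enum (downloads ImageNet weights).
model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)

for p in model.parameters():            # freeze the pretrained backbone
    p.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, 2)   # new head: differentiated vs. not

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# Placeholder batch: 3-channel 224x224 crops with dummy labels.
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, 2, (8,))

model.train()
optimizer.zero_grad()
loss = criterion(model(images), labels)  # one training step on the new head only
loss.backward()
optimizer.step()
print(float(loss))
```

Unfreezing the last convolutional block after a few epochs is a common refinement, but the frozen-backbone variant shown here is the simplest form of transfer learning.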