We present a portable multi-contrast microscope capable of producing bright-field, dark-field, and differential phase contrast images of thin biological specimens on a smartphone platform. The microscopy method is based on an imaging scheme termed “color-coded light-emitting-diode (LED) microscopy (cLEDscope),” in which a specimen is illuminated with a color-coded LED array and light transmitted through the specimen is recorded by a color image sensor. Decomposition of the image into red, green, and blue colors and subsequent computation enable multi-contrast imaging in a single shot. In order to transform a smartphone into a multi-contrast imaging device, we developed an add-on module composed of a patterned color micro-LED array, specimen stage, and miniature objective. Simple installation of this module onto a smartphone enables multi-contrast imaging of transparent specimens. In addition, an Android-based app was implemented to acquire an image, perform the associated computation, and display the multi-contrast images in real time. Herein, the details of our smartphone module and experimental demonstrations with various biological specimens are presented.
We present a multi-contrast microscope based on color-coded illumination and computation. A programmable three-color light-emitting diode (LED) array illuminates a specimen, in which each color corresponds to a different illumination angle. A single color image sensor records light transmitted through the specimen, and images at each color channel are then separated and utilized to obtain bright-field, dark-field, and differential phase contrast (DPC) images simultaneously. Quantitative phase imaging is also achieved based on DPC images acquired with two different LED illumination patterns. The multi-contrast and quantitative phase imaging capabilities of our method are demonstrated by presenting images of various transparent biological samples.
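The channel-decomposition step described above can be illustrated with a short sketch. This is not the authors' implementation: it assumes, hypothetically, that the red and blue channels encode two complementary illumination half-circles (for one differential phase contrast axis) and that the green channel encodes an oblique-only region outside the objective NA (dark-field); the actual cLEDscope LED layout differs in detail.

```python
import numpy as np

def multi_contrast(rgb):
    """Form multi-contrast images from one color-coded exposure.

    rgb: (..., 3) array of raw red/green/blue channel intensities.
    Hypothetical channel assignment: red and blue = opposite
    illumination half-circles, green = dark-field region.
    """
    I_r, I_g, I_b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    eps = 1e-12  # avoid division by zero in empty pixels
    bright_field = I_r + I_b                 # sum over complementary halves
    dark_field = I_g                         # light scattered beyond the NA
    dpc = (I_r - I_b) / (I_r + I_b + eps)    # normalized left/right asymmetry
    return bright_field, dark_field, dpc
```

The normalized difference in the last line is the standard DPC form: it cancels absorption contrast so the result is dominated by the phase gradient along the split axis.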
We present a machine learning (ML) pipeline to identify star clusters in the multicolor images of nearby galaxies, from observations obtained with the Hubble Space Telescope as part of the Treasury Project LEGUS (Legacy ExtraGalactic Ultraviolet Survey). StarcNet (STAR Cluster classification NETwork) is a multiscale convolutional neural network (CNN) that achieves an accuracy of 68.6% (four classes)/86.0% (two classes: cluster/noncluster) for star cluster classification in the images of the LEGUS galaxies, nearly matching human expert performance. We test the performance of StarcNet by applying a pre-trained CNN model to galaxies not included in the training set, finding accuracies similar to the reference values. We test the effect of StarcNet predictions on the inferred cluster properties by comparing multicolor luminosity functions and mass–age plots from catalogs produced by StarcNet and by human labeling; distributions in luminosity, color, and physical characteristics of star clusters are similar for the human- and ML-classified samples. There are two advantages to the ML approach: (1) reproducibility of the classifications: the ML algorithm’s biases are fixed and can be measured for subsequent analysis; and (2) speed of classification: the algorithm requires minutes for tasks that take humans weeks to months to perform. By achieving accuracy comparable to that of human classifiers, StarcNet will enable extending classifications to a larger number of candidate samples than currently available, thus significantly increasing the statistics for cluster studies.
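The two-class accuracy quoted above comes from collapsing the four morphological classes into cluster/noncluster before scoring. A minimal sketch of that collapse, assuming the common LEGUS convention that classes 1–3 are clusters and class 4 is non-cluster (the exact mapping should be checked against the paper):

```python
import numpy as np

def binary_accuracy(y_true, y_pred, cluster_classes=(1, 2, 3)):
    """Score four-class labels after collapsing to cluster/noncluster.

    Labels in `cluster_classes` are treated as "cluster"; everything
    else (here, class 4) as "noncluster".
    """
    t = np.isin(y_true, cluster_classes)  # true binary labels
    p = np.isin(y_pred, cluster_classes)  # predicted binary labels
    return float(np.mean(t == p))
```

Note that a prediction can be wrong at the four-class level (e.g., class 1 predicted as class 2) yet still count as correct after the collapse, which is why the two-class accuracy is higher.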
We demonstrate single-shot quantitative phase imaging (QPI) in a platform of color-coded LED microscopy (cLEDscope). The light source in a conventional microscope is replaced by a circular LED pattern that is trisected into subregions of equal area, assigned to red, green, and blue colors. Image acquisition with a color image sensor and subsequent computation based on weak-object transfer functions allow for QPI of a transparent specimen. We also provide a correction method for color leakage, which may be encountered when implementing our method with consumer-grade LEDs and image sensors: most commercially available LEDs and image sensors do not provide spectrally isolated emissions and pixel responses, generating significant error in the phase estimation. We describe the correction scheme for this color-leakage issue and demonstrate improved phase measurement accuracy. The computational model and single-exposure QPI capability of our method are demonstrated with images of calibrated phase samples and cellular specimens.
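The color-leakage problem can be pictured as channel crosstalk: each sensor channel responds partly to the other two LED colors. A minimal sketch of the correction under a linear-mixing assumption, with a made-up crosstalk matrix `M` (the paper's actual correction is derived from measured LED spectra and sensor responses, not these values):

```python
import numpy as np

# Hypothetical crosstalk matrix: M[i, j] is the response of sensor
# channel i to LED color j. In practice this would be calibrated
# per device from the LED emission spectra and sensor responses.
M = np.array([[0.92, 0.06, 0.01],
              [0.07, 0.88, 0.08],
              [0.01, 0.06, 0.91]])

def unmix(measured_rgb, M=M):
    """Estimate leakage-free channel images by inverting the crosstalk.

    measured_rgb: (..., 3) array of raw channel intensities, modeled
    as measured = M @ ideal per pixel.
    """
    M_inv = np.linalg.inv(M)
    return measured_rgb @ M_inv.T  # apply M_inv to the last axis
```

With the unmixed channels in hand, each one again corresponds to a single illumination subregion, as the transfer-function model requires.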
Satellite-based vegetation indices are an essential element in understanding the Earth's surface. In this study, we estimated the normalized difference vegetation index (NDVI) using Himawari-8/Advanced Himawari Imager (AHI) data and analyzed the sensitivity of the products to atmospheric and surface correction. We used the Second Simulation of a Satellite Signal in the Solar Spectrum (6S) radiative transfer model for atmospheric correction, and a kernel-based semi-empirical bidirectional reflectance distribution function (BRDF) model to remove surface anisotropic effects. From this, top-of-atmosphere, top-of-canopy, and normalized NDVIs were produced. A sensitivity analysis showed that the normalized NDVI had the fewest missing values of the three and almost no low peaks during the study period. These results were validated against the Terra and Aqua/Moderate Resolution Imaging Spectroradiometer (MODIS) and Project for On-Board Autonomy/Vegetation (PROBA) NDVI products, showing root mean square error (RMSE) and bias values of 0.09 and +0.04 (MODIS) and 0.09 and −0.04 (PROBA), respectively. These results also satisfied the FP7 Geoland2/BioPar project-defined user requirements (threshold: 0.15; target: 0.10).
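The index itself is the standard normalized ratio of near-infrared and red reflectance, computed per pixel after the corrections above. A minimal sketch (the AHI band assignment in the comment is our reading of the sensor specification, not stated in the abstract):

```python
import numpy as np

def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red), elementwise.

    For Himawari-8/AHI this would typically use band 4 (~0.86 um)
    as NIR and band 3 (~0.64 um) as red, after atmospheric and
    BRDF correction.
    """
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + 1e-12)  # eps guards empty pixels
```

Values fall in [-1, 1]; dense green vegetation pushes the ratio toward 1, while clouds and bare surfaces sit near or below zero, which is why residual atmospheric effects appear as the "low peaks" mentioned above.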