We discuss unique features of lens-free computational imaging tools and report some of their emerging results for wide-field on-chip microscopy, such as the achievement of a numerical aperture (NA) of ~0.8–0.9 across a field of view (FOV) of more than 20 mm2 or an NA of ~0.1 across a FOV of ~18 cm2, which corresponds to an image with more than 1.5 gigapixels. We also discuss the current challenges that these computational on-chip microscopes face, shedding light on their future directions and applications.
Optical imaging of nanoscale objects, whether it is based on scattering or fluorescence, is a challenging task due to reduced detection signal-to-noise ratio and contrast at subwavelength dimensions. Here, we report a field-portable fluorescence microscopy platform installed on a smart phone for imaging of individual nanoparticles as well as viruses using a lightweight and compact opto-mechanical attachment to the existing camera module of the cell phone. This hand-held fluorescent imaging device utilizes (i) a compact 450 nm laser diode that creates oblique excitation on the sample plane with an incidence angle of ~75°, (ii) a long-pass thin-film interference filter to reject the scattered excitation light, (iii) an external lens creating 2× optical magnification, and (iv) a translation stage for focus adjustment. We tested the imaging performance of this smart-phone-enabled microscopy platform by detecting isolated 100 nm fluorescent particles as well as individual human cytomegaloviruses that are fluorescently labeled. The size of each detected nano-object on the cell phone platform was validated using scanning electron microscopy images of the same samples. This field-portable fluorescence microscopy attachment to the cell phone, weighing only ~186 g, could be used for specific and sensitive imaging of subwavelength objects including various bacteria and viruses and, therefore, could provide a valuable platform for the practice of nanotechnology in field settings and for conducting viral load measurements and other biomedical tests even in remote and resource-limited environments.
We demonstrate that a deep neural network can significantly improve optical microscopy, enhancing its spatial resolution over a large field-of-view and depth-of-field. After its training, the only input to this network is an image acquired using a regular optical microscope, without any changes to its design. We blindly tested this deep learning approach using various tissue samples that are imaged with low-resolution and wide-field systems, where the network rapidly outputs an image with remarkably better resolution, matching the performance of higher numerical aperture lenses, while also significantly surpassing their limited field-of-view and depth-of-field. These results are transformative for various fields that use microscopy tools, including, e.g., the life sciences, where optical microscopy is considered one of the most widely used and deployed techniques. Beyond such applications, our presented approach is broadly applicable to other imaging modalities, also spanning different parts of the electromagnetic spectrum, and can be used to design computational imagers that get better and better as they continue to image specimens and establish new transformations among different modes of imaging.
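At its core, the approach above is an image-to-image convolutional transformation: a low-resolution wide-field image goes in, and an enhanced image comes out. As a minimal, assumption-laden sketch of that kind of forward computation, the NumPy snippet below upsamples a stand-in low-resolution image and applies a single hand-set sharpening kernel; the real system uses many learned convolutional layers trained on matched image pairs, none of which are reproduced here.

```python
import numpy as np

def conv2d(img, kernel):
    """Valid-mode 2D convolution for a single-channel image."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def upsample(img, factor=2):
    """Nearest-neighbour upsampling to a finer pixel grid."""
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

# One fixed sharpening kernel stands in for the trained network's
# many learned filters (purely illustrative, not the actual model).
sharpen = np.array([[ 0., -1.,  0.],
                    [-1.,  5., -1.],
                    [ 0., -1.,  0.]])

low_res = np.random.rand(16, 16)   # stand-in for a low-NA, wide-field image
enhanced = conv2d(upsample(low_res, 2), sharpen)
print(enhanced.shape)  # → (30, 30)
```

Note that the sharpening kernel sums to one, so flat regions of the image are left unchanged while edges are amplified; a trained network effectively learns a far richer, data-driven version of this mapping.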