Microscopic observations of cultured cells in many lab-on-a-chip applications mostly rely on digital image acquisition with CCD sensors connected to a personal computer. The functionality of this digital imaging can be enhanced by computer-vision-based augmented reality technologies. In this study, we present a new method for precisely relocating biological specimens under microscopic inspection by using augmented reality patterns, called microscopic augmented reality indicators (μ-ARIs). Since the method only requires sticky films attached underneath sample containers of any shape, long-term live-cell observations can be conducted at much lower additional cost than with conventional methods. On these sticky films, multiple arrays of position-indicating patterns are imprinted to provide a reference coordinate system for recording and restoring the exact position and rotation of the specimen under inspection. This approach can be useful for obtaining the exact locations of individual cells inside biological samples in a rapid and controlled manner using μ-ARI-imprinted transparent films.
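As a rough illustration of how such imprinted patterns can serve as a reference coordinate system, the following Python sketch (assuming marker detection has already been done elsewhere; the coordinates and names are hypothetical, and OpenCV is used only for the transform estimation) re-expresses a previously saved cell position in the current camera view:

```python
# Minimal sketch, not the authors' implementation: recover the film's reference
# frame from detected marker centroids, then re-project a cell position that
# was recorded in an earlier session into the current image.
import numpy as np
import cv2

# Marker centroids in the film's reference frame, known from the imprint layout (µm)
reference_markers = np.float32([[0, 0], [500, 0], [0, 500], [500, 500]])

# The same markers as detected in the current camera image (pixels) -- placeholder values
detected_markers = np.float32([[120.4, 88.1], [610.9, 95.6], [112.7, 579.3], [603.2, 586.8]])

# Rotation + translation + uniform scale mapping the reference frame onto the image;
# this encodes the specimen's current position and rotation.
transform, _ = cv2.estimateAffinePartial2D(reference_markers, detected_markers)
rotation_deg = np.degrees(np.arctan2(transform[1, 0], transform[0, 0]))

# Re-project a cell position saved earlier (reference-frame coordinates, µm)
saved_cell = np.float32([[250.0, 130.0]])
current_pixel = cv2.transform(saved_cell[None, :, :], transform)[0, 0]
print(f"specimen rotated by {rotation_deg:.2f} deg; cell expected near pixel {current_pixel}")
```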
A prototype system that replaces conventional time-lapse imaging in microscopic inspection with a smartphone-based workflow is presented. Existing time-lapse imaging requires a video data feed between a microscope and a computer that varies depending on the type of image grabber. Even with a proper hardware setup, a series of tedious and repetitive tasks is still required to relocate the region of interest (ROI) of the specimen. To simplify the system and improve the efficiency of time-lapse imaging tasks, a smartphone-based platform utilizing microscopic augmented reality (μ-AR) markers is proposed. To evaluate the feasibility and efficiency of the proposed system, a user test was designed and performed, measuring the elapsed time for one trial of the task, from launching the application software to restoring and imaging an ROI saved in advance. The results showed an average elapsed time of 65.3 ± 15.2 s, with a position error of 6.86 ± 3.61 μm and an angle error of 0.08 ± 0.40 degrees. This indicates that the time-lapse imaging task was accomplished rapidly and with a high level of accuracy. Thus, both the system and the task were simplified by the proposed approach.
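The reported accuracy figures can be aggregated per trial in the usual way; the short sketch below (with placeholder values, not the study's data) illustrates how position and angle errors between a saved ROI pose and the restored pose might be summarized as mean ± standard deviation:

```python
# Minimal sketch, assuming per-trial logs of the restored ROI pose versus the
# originally saved pose; all values below are placeholders.
import numpy as np

# Columns: x (µm), y (µm), rotation (deg) -- saved reference pose per trial
saved_poses    = np.array([[100.0, 200.0, 0.0], [150.0, 180.0, 1.5]])
# Pose actually reached after relocating with the µ-AR marker
restored_poses = np.array([[104.2, 198.9, 0.3], [145.5, 183.1, 1.1]])

position_err = np.linalg.norm(restored_poses[:, :2] - saved_poses[:, :2], axis=1)
# Wrap angle differences into (-180, 180] before averaging
angle_err = (restored_poses[:, 2] - saved_poses[:, 2] + 180.0) % 360.0 - 180.0

print(f"position error: {position_err.mean():.2f} ± {position_err.std(ddof=1):.2f} µm")
print(f"angle error:    {angle_err.mean():.2f} ± {angle_err.std(ddof=1):.2f} deg")
```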
This study proposes a novel way to achieve high-throughput image acquisition based on a computer-recognizable micro-pattern implemented on a microfluidic device. We integrated a QR code, a two-dimensional barcode system, onto the microfluidic device to simplify imaging of multiple regions of interest (ROIs). A standard QR code pattern was modified into arrays of cylindrical polydimethylsiloxane (PDMS) structures. By recognizing this micro-pattern, the proposed system enables: (1) device identification, which allows additional information about the device, such as imaging sequences or ROIs, to be referenced; and (2) construction of a coordinate system for an arbitrarily placed microfluidic device with respect to the stage. Based on these functionalities, the proposed method performs one-step high-throughput imaging for data acquisition on microfluidic devices without further manual exploration and localization of the desired ROIs. In our experience, the proposed method significantly reduced the time required to prepare an acquisition. We expect that the method will substantially improve data acquisition and analysis for prototype devices.
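One plausible realization of this workflow, sketched below in Python with OpenCV's QR detector, decodes the on-chip pattern to look up the device's predefined ROIs and anchors its design coordinates to the current image; the device ID, QR edge length, and ROI table are illustrative assumptions, not the paper's actual pipeline:

```python
# Minimal sketch of the assumed workflow: (1) identify the device from its QR
# pattern, (2) build a design-to-image transform from the QR corners, then
# (3) map predefined ROIs into the current field of view.
import numpy as np
import cv2

QR_EDGE_UM = 1000.0  # physical edge length of the imprinted QR pattern (assumed)
ROI_TABLE = {        # device ID -> ROI centres in device design coordinates (µm), hypothetical
    "chip-A07": [(1500.0, 800.0), (1500.0, 1600.0), (2300.0, 800.0)],
}

image = cv2.imread("stage_view.png", cv2.IMREAD_GRAYSCALE)
if image is None:
    raise FileNotFoundError("stage_view.png not found")

detector = cv2.QRCodeDetector()
device_id, corners, _ = detector.detectAndDecode(image)
if not device_id:
    raise RuntimeError("no QR pattern found in the field of view")

# OpenCV returns the QR corners in order: top-left, top-right, bottom-right, bottom-left
design_corners = np.float32([[0, 0], [QR_EDGE_UM, 0],
                             [QR_EDGE_UM, QR_EDGE_UM], [0, QR_EDGE_UM]])
transform, _ = cv2.estimateAffinePartial2D(design_corners, corners.reshape(-1, 2))

# Map every predefined ROI of this device into current image coordinates
rois = np.float32(ROI_TABLE[device_id])[None, :, :]
roi_pixels = cv2.transform(rois, transform)[0]
for (x, y) in roi_pixels:
    print(f"{device_id}: ROI centre at image position ({x:.1f}, {y:.1f}) px")
```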
In this paper, we present a novel 'push-able' interface based on a stretchable elastic screen. This interface enables intuitive exploration of complex, multi-dimensional data structures. By deforming the elastic membrane of the screen, users can manipulate not only their points of interest, as with a traditional mouse cursor, but also the surrounding regions. Through its restoring force, the elastic screen also provides natural passive force feedback when stretched, which makes more intuitive interactions possible. We have applied this system to several applications, including a browser for computed tomography data of the human body and a graph-navigation scheme based on physical user-interaction forces.
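As a loose illustration of the idea (not the paper's algorithm), the following sketch maps a hypothetical push position and depth to a local, fisheye-like magnification of graph nodes around the point of interest, so that both the focus and its surroundings respond to the deformation:

```python
# Minimal sketch, assuming the membrane sensor reports a push position and a
# normalized push depth; the falloff model and all names are illustrative.
import numpy as np

def lens_displace(node_xy, push_xy, push_depth, radius=120.0, max_gain=1.5):
    """Push nodes radially away from the touch point, scaled by push depth."""
    offset = node_xy - push_xy
    dist = np.linalg.norm(offset, axis=1, keepdims=True)
    # Smooth falloff: full effect at the push point, none beyond `radius`
    falloff = np.clip(1.0 - dist / radius, 0.0, 1.0)
    gain = 1.0 + max_gain * push_depth * falloff   # push_depth in [0, 1]
    return push_xy + offset * gain

nodes = np.array([[10.0, 0.0], [60.0, 40.0], [300.0, 300.0]])
print(lens_displace(nodes, push_xy=np.array([0.0, 0.0]), push_depth=0.8))
```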