Input is a significant problem for wearable systems, particularly for head-mounted virtual and augmented reality displays. Existing input techniques either lack expressive power or may not be socially acceptable. As an alternative, thumb-to-finger touches present a promising input mechanism that is subtle yet capable of complex interactions. We present DigiTouch, a reconfigurable glove-based input device that enables thumb-to-finger touch interaction by sensing continuous touch position and pressure. Our novel sensing technique improves the reliability of tracking continuous touch position and estimating pressure on resistive fabric interfaces. We demonstrate DigiTouch’s utility by enabling a set of easily reachable and reconfigurable widgets such as buttons and sliders. Since DigiTouch senses continuous touch position, widget layouts can be customized according to user preferences and application needs. As an example of a real-world application of this reconfigurable input device, we examine a split-QWERTY keyboard layout mapped to the user’s fingers. We evaluate DigiTouch for text entry using a multi-session study. With our continuous sensing method, users reliably learned to type and achieved a mean typing speed of 16.0 words per minute at the end of ten 20-minute sessions, an improvement over similar wearable touch systems.
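To illustrate what a reconfigurable thumb-to-finger layout might look like in software, the sketch below maps a normalized touch position along one finger to widget regions, here one row of a split-QWERTY layout, and uses an estimated pressure value to gate touch events. The Widget class, the even subdivision of the finger, and the 0.2 pressure threshold are illustrative assumptions, not details of the DigiTouch implementation.

```python
# Hypothetical sketch: mapping a continuous 1-D touch position on a finger
# segment to a reconfigurable widget region (here, keys of a split-QWERTY row).
# Names and thresholds are illustrative, not taken from the DigiTouch paper.

from dataclasses import dataclass

@dataclass
class Widget:
    label: str
    start: float  # normalized position along the finger, 0.0 = base
    end: float    # 1.0 = fingertip

def build_row(labels):
    """Divide a finger evenly into one widget per label."""
    step = 1.0 / len(labels)
    return [Widget(l, i * step, (i + 1) * step) for i, l in enumerate(labels)]

def hit_test(row, position, pressure, threshold=0.2):
    """Return the widget under a touch, or None if pressure is below threshold."""
    if pressure < threshold:
        return None
    for w in row:
        if w.start <= position < w.end:
            return w
    return row[-1] if position >= 1.0 else None

# Example: the index finger of one hand carries one row of a split-QWERTY layout.
index_row = build_row(["q", "w", "e", "r", "t"])
touch = hit_test(index_row, position=0.45, pressure=0.6)
print(touch.label if touch else "no touch")  # -> "e"
```

Because the layout is just a list of regions over a normalized coordinate, reconfiguring it (wider keys, a slider instead of buttons) only changes the region list, which mirrors the customization the abstract describes.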
Centimeter-scale mobile biobots offer unique advantages in uncertain environments. Our previous experiments demonstrated neural stimulation techniques to control the motion of Madagascar hissing cockroaches. Those trials relied on stimulation by a human operator using a remote control. We have developed a Kinect-based system for computer-operated automatic control of cockroaches. Using image processing techniques and a radio transmitter, this platform both detects the position of the roach biobot and sends stimulation commands to an implanted microcontroller-based receiver. The work presented here enables repeatable experimentation and allows precise quantification of the line-following capabilities of the roach biobot. This system will help refine our model of the insect's stimulation response and improve our ability to direct biobots in increasingly dynamic situations.
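The abstract describes a closed loop in which the vision system estimates the biobot's pose and a transmitter issues stimulation commands to keep it on a line. The sketch below shows one plausible form of that loop: compute a signed cross-track error and heading error against the target line, then map the combined error to a turn command that the transmitter would translate into antenna stimulation. The geometry, gains, deadband, and command names are assumptions for illustration only, not the controller used in this work.

```python
# Illustrative closed-loop line-following sketch: the tracker supplies the
# biobot's position and heading; the controller picks a turn command that a
# radio transmitter would convert into a stimulation pulse. All constants and
# command names are hypothetical.

import math

def cross_track_error(pos, heading, line_start, line_end):
    """Signed lateral distance from the target line and heading error relative to it."""
    lx, ly = line_end[0] - line_start[0], line_end[1] - line_start[1]
    length = math.hypot(lx, ly)
    ux, uy = lx / length, ly / length            # unit vector along the line
    px, py = pos[0] - line_start[0], pos[1] - line_start[1]
    lateral = px * (-uy) + py * ux               # positive = left of the line
    heading_err = (heading - math.atan2(ly, lx) + math.pi) % (2 * math.pi) - math.pi
    return lateral, heading_err

def choose_command(lateral, heading_err, k_lat=0.05, k_head=1.0, deadband=0.1):
    """Map the combined error to a turn command; no stimulation when on course."""
    steer = k_lat * lateral + k_head * heading_err
    if abs(steer) < deadband:
        return "NONE"
    return "TURN_RIGHT" if steer > 0 else "TURN_LEFT"

# Example: biobot is 20 px left of a horizontal target line and angled away from it.
lateral, heading_err = cross_track_error(pos=(100, 220), heading=0.2,
                                         line_start=(0, 200), line_end=(640, 200))
print(choose_command(lateral, heading_err))      # -> "TURN_RIGHT"
```

Running this loop on every camera frame is what makes the experiments repeatable: deviation from the line can be logged continuously rather than judged by a human operator.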
Before a person suffering from a traumatic brain injury (TBI) reaches a medical facility, their pupillary light reflex (PLR) is one of the few quantitative measures a clinician can use to predict their outcome. We propose PupilScreen, a smartphone app and accompanying 3D-printed box that combines the repeatability, accuracy, and precision of a clinical device with the ubiquity and convenience of the penlight test that clinicians regularly use in emergency situations. The PupilScreen app stimulates the patient's eyes using the smartphone's flash and records the response using the camera. The PupilScreen box, akin to a head-mounted virtual reality display, controls the eyes' exposure to light. The recorded video is processed using convolutional neural networks that track the pupil diameter over time, allowing for the derivation of clinically relevant measures. We tested two different network architectures and found that a fully convolutional neural network was able to track pupil diameter with a median error of 0.30 mm. We also conducted a pilot clinical evaluation with six patients who had suffered a TBI and found that clinicians could almost perfectly separate unhealthy pupillary light reflexes from healthy ones using PupilScreen alone.
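A minimal sketch, assuming the network has already produced a pupil-diameter time series in millimeters at a known frame rate, of how clinically relevant PLR measures (baseline diameter, constriction amplitude, latency, and mean constriction velocity) could be derived from that trace. The function name, the onset threshold, and the exact metric definitions are standard but illustrative assumptions; they are not taken from the PupilScreen paper.

```python
# Minimal sketch: deriving basic pupillary light reflex measures from a
# diameter trace (mm) produced by a pupil tracker. Thresholds and metric
# definitions are illustrative, not the PupilScreen implementation.

def plr_metrics(diameters_mm, fps, stimulus_frame, onset_drop_mm=0.1):
    """Compute baseline, constriction amplitude, latency, and mean velocity."""
    baseline = sum(diameters_mm[:stimulus_frame]) / stimulus_frame
    post = diameters_mm[stimulus_frame:]
    min_diameter = min(post)

    # Latency: time from the light stimulus until the diameter has dropped
    # noticeably below baseline.
    onset_idx = next(
        (i for i, d in enumerate(post) if baseline - d >= onset_drop_mm), None
    )
    latency_s = onset_idx / fps if onset_idx is not None else None

    amplitude_mm = baseline - min_diameter
    time_to_min_s = post.index(min_diameter) / fps
    velocity_mm_s = amplitude_mm / time_to_min_s if time_to_min_s > 0 else None

    return {
        "baseline_mm": baseline,
        "constriction_amplitude_mm": amplitude_mm,
        "latency_s": latency_s,
        "mean_constriction_velocity_mm_s": velocity_mm_s,
    }

# Example with a synthetic 30 fps trace: flash at frame 30, pupil constricts
# from 6.0 mm to 3.5 mm over roughly one second.
trace = [6.0] * 30 + [6.0 - min(2.5, 0.1 * i) for i in range(60)]
print(plr_metrics(trace, fps=30, stimulus_frame=30))
```

Summaries like these, rather than the raw video, are what a clinician would compare against normative values when judging whether a reflex looks healthy.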