Diagnostic self-tracking, the recording of personal information to diagnose or manage a health condition, is a common practice, especially for people with chronic conditions. Unfortunately, many who attempt diagnostic self-tracking have trouble accomplishing their goals. People often lack the knowledge and skills needed to design and conduct scientifically rigorous experiments, and current tools provide little support. To address these shortcomings and explore opportunities for diagnostic self-tracking, we designed, developed, and evaluated a mobile app that applies a self-experimentation framework to support patients with irritable bowel syndrome (IBS) in identifying their personal food triggers. TummyTrials aids a person in designing, executing, and analyzing self-experiments to evaluate whether a specific food triggers their symptoms. We examined the feasibility of this approach in a field study with 15 IBS patients, finding that participants could use the tool to reliably conduct a self-experiment. However, we also discovered an underlying tension between scientific validity and the lived experience of self-experimentation. We discuss the challenges of applying clinical research methods in everyday life, motivating a need for self-experimentation systems that balance scientific rigor with the uncertainties of daily living.
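For intuition, here is a minimal sketch of one way such a food-trigger self-experiment could be analyzed: compare symptom ratings on days the suspected food was eaten against days it was avoided. The data, the function name, and the permutation-test approach are illustrative assumptions, not TummyTrials' actual analysis method.

```python
# Illustrative sketch: did symptoms differ between days a suspected trigger
# food was eaten and days it was avoided? A simple permutation test on the
# difference in mean symptom ratings. All values below are hypothetical.
import random

def permutation_test(trigger_days, control_days, n_iter=10_000, seed=0):
    """One-sided p-value for 'trigger days have higher mean symptoms'."""
    rng = random.Random(seed)
    observed = (sum(trigger_days) / len(trigger_days)
                - sum(control_days) / len(control_days))
    pooled = trigger_days + control_days
    n = len(trigger_days)
    hits = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)
        diff = (sum(pooled[:n]) / n
                - sum(pooled[n:]) / len(pooled[n:]))
        if diff >= observed:
            hits += 1
    return hits / n_iter

# Hypothetical 12-day trial: symptom severity (0-10) on randomized days.
with_trigger = [6, 7, 5, 8, 6, 7]     # days the suspected food was eaten
without_trigger = [3, 4, 2, 5, 3, 4]  # days it was avoided

p = permutation_test(with_trigger, without_trigger)
print(f"p = {p:.4f}")  # a small p suggests the food may be a trigger
```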
We discuss the development of Tactile Graphics with a Voice (TGV), a system for accessing label information in tactile graphics using QR codes. Blind students often rely on tactile graphics to access textbook images, and many of those images contain a large number of text labels that must be made accessible. TGV replaces the text labels with QR codes, as an alternative to Braille; the codes are read with a smartphone application. We evaluated the system in a longitudinal study in which 10 blind and low-vision participants completed tasks using three different modes of the smartphone application: (1) no guidance, (2) verbal guidance, and (3) finger-pointing guidance. Our results show that TGV is an effective way to access text in tactile graphics, especially for blind users who are not fluent in Braille. We also found that preferences varied greatly across the modes, indicating that future work should support multiple modes. We expand upon the algorithms we used to implement the finger-pointing guidance, as well as the algorithms that automatically place QR codes on documents, and we discuss work we have started on a Google Glass version of the application.
Textbook figures are often converted into a tactile format for access by blind students, but these figures are not truly accessible unless the text within them is also made accessible. A common solution is to emboss the text in Braille. We have developed an alternative for students who want tactile graphics but prefer that the text in figures be spoken rather than rendered in Braille. Tactile Graphics with a Voice (TGV) makes text within tactile graphics accessible through a talking QR code reader app on a smartphone. To evaluate TGV, we performed a longitudinal study in which ten blind and low-vision participants completed tasks using three alternative picture-taking guidance techniques: (1) no guidance, (2) verbal guidance, and (3) finger-pointing guidance. Our results show that TGV is an effective way to access text in tactile graphics, especially for blind users who are not fluent in Braille. In addition, guidance preferences varied, with each technique preferred by at least one participant.
Textbook images are converted into tactile graphics to make them accessible to blind and low-vision students. The text labels on these graphics are an important part of the image and must be made accessible as well. The labels are usually embossed in Braille, but some blind and low-vision students cannot read Braille and need to access the labels in a different way. We present Tactile Graphics with a Voice, a system that encodes the labels as QR codes, which are read aloud by TGV, the smartphone application we developed. TGV provides feedback to guide the user in scanning a QR code and lets the user select which code to scan when multiple codes are close together.
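As a rough illustration of the scanning idea, the sketch below detects multiple QR codes in a photo, selects the one nearest the frame center (a stand-in for the user's selection), and speaks its label. TGV itself is a smartphone app; OpenCV, pyttsx3, and the file name here are assumptions for a desktop approximation, not TGV's actual implementation.

```python
# Desktop approximation of the TGV idea: find QR-encoded labels in an image,
# pick the code nearest the image center, and speak the decoded text aloud.
import cv2
import numpy as np
import pyttsx3

def read_nearest_label(frame):
    detector = cv2.QRCodeDetector()
    ok, texts, points, _ = detector.detectAndDecodeMulti(frame)
    if not ok:
        return None  # caller can prompt the user to adjust the camera
    center = np.array([frame.shape[1] / 2, frame.shape[0] / 2])
    # points holds one 4x2 corner quad per detected code; choose the quad
    # whose centroid is closest to the frame center, mimicking the user
    # selecting one code among several that are close together.
    dists = [np.linalg.norm(quad.mean(axis=0) - center) for quad in points]
    return texts[int(np.argmin(dists))] or None

frame = cv2.imread("tactile_page.jpg")  # hypothetical photo of the graphic
if frame is None:
    raise SystemExit("image not found")

label = read_nearest_label(frame)
if label:
    engine = pyttsx3.init()
    engine.say(label)  # speak the decoded label text aloud
    engine.runAndWait()
```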