With the advent of portable devices such as smartphones and tablets, computer-based touch-screen assistive technologies have become significantly more affordable than traditional tactile graphics. The sensor panel in these devices allows users to receive visual and auditory feedback by interacting with the screen. However, visually impaired individuals (those with a lack or loss of the ability to see) cannot make use of visual feedback on tablets or smartphones. Therefore, in this paper we propose a system that helps visually impaired people comprehend information on electronic devices through auditory action feedback. We develop a multimedia system that produces sound from a given image via object detection. In this study, YOLO (You Only Look Once) is used for object detection for sonification; because a pretrained model is used, a wide range of object classes can be identified. The system generates the corresponding sound when an object on the touch screen is touched. The purpose of our research is to help visually impaired people perceive the information in a picture shown on the device by touching the detected objects. The system was tested by simulating visual impairment: blindfolded participants with normal vision used the device and then filled out questionnaires on its performance. The results indicate that most users found the sounds produced by the device helpful in telling them what the shown image was.
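The core interaction described above can be sketched as a touch-to-label lookup: a pretrained YOLO model yields labeled bounding boxes, and a touch point is tested against those boxes to decide which object's sound to play. The following is a minimal sketch of that lookup step; the function and variable names (`find_touched_object`, `detections`) are illustrative assumptions, not the authors' actual API, and the detections are hypothetical sample data rather than real model output.

```python
# Hedged sketch: assumes object detection has already been run by a
# pretrained YOLO model, producing (label, bounding_box) pairs in
# screen coordinates. All names here are illustrative.

from typing import List, Optional, Tuple

BBox = Tuple[float, float, float, float]  # (x_min, y_min, x_max, y_max)


def find_touched_object(
    touch_x: float,
    touch_y: float,
    detections: List[Tuple[str, BBox]],
) -> Optional[str]:
    """Return the label of the first detected object whose bounding box
    contains the touch point, or None if the touch hits the background."""
    for label, (x_min, y_min, x_max, y_max) in detections:
        if x_min <= touch_x <= x_max and y_min <= touch_y <= y_max:
            return label
    return None


# Hypothetical detections on an 800x600 image.
detections = [
    ("dog", (100.0, 200.0, 300.0, 450.0)),
    ("bicycle", (350.0, 150.0, 700.0, 500.0)),
]

print(find_touched_object(200, 300, detections))  # touch inside the dog's box -> "dog"
print(find_touched_object(10, 10, detections))    # touch on background -> None
```

In the full system, the returned label would then be mapped to a corresponding sound (e.g. a bark for "dog") and played back as auditory feedback.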