The intensive care unit (ICU) is one of the most complex areas of hospital care, as patients require continuous monitoring by physicians and nurses. Currently, clinicians are informed of patients’ physiological conditions through visual color-coded signals and auditory alarms. Previous studies have shown that vibrotactile cues can inform clinicians of a patient’s vital-signs status, in either a unisensory or a multisensory alarm scheme. We present the results of the first in a series of experiments examining the feasibility of using tactile cues, rendered through a lower-leg tactile interface, to convey detailed physiological information about more than one patient. The current experiment used a simulated clinical environment with 14 undergraduate students. Participants were required to interpret information delivered by the tactile interface for two different patients while performing a continuous, cognitively demanding task. Results indicate that under such conditions it is possible to deliver critical information with a successful interpretation rate of approximately 85%, though not without cost to the continuous demanding task. Future experiments should evaluate additional tactile patterns to increase their interpretation success rate, and should test these tactile cues with clinicians.
Vibrotactile interfaces can support users in a variety of tasks and contexts. Despite their inherent advantages, they are limited in the type and amount of information they can convey. This study is part of a series of experiments that aim to develop and evaluate a “tactile taxonomy” for dismounted operational environments. The current experiment simulated an operational mission with a remote Unmanned Ground Vehicle (UGV). During the mission, 20 participants were required to interpret notifications received in one or more of the following modalities: auditory, visual, and/or tactile. Three notification types were chosen based on previous studies to provide an intuitive connection between each notification and its semantic meaning. We collected response times to notifications, participants’ ability to distinguish between the information types the notifications conveyed, and operational mission performance metrics. Results indicate that a limited “tactile taxonomy” can be used in a visually loaded and auditorily noisy scene while performing a demanding operational task. Combining the tactile modality with other sensory modalities improved participants’ ability to perceive and identify the notifications.
In armored fighting vehicles (AFVs), the vehicle commander (VC) and crew communicate over an audio system that all crew members inside the AFV can hear. This can be distracting and inconvenient, particularly for effective communication between the VC and the driver. We assessed the feasibility and usability of a tactile system for direct communication between the VC and the driver, in addition to (or instead of) the existing auditory system. Field experiment results show that, with or without auditory commands, tactile cues can be used to direct AFV drivers during operational tasks. Hence, the design and implementation of a dedicated tactile interface should be considered for internal crew communication.