Background
Virtual reality (VR) devices are increasingly used in medicine and other fields for a broad spectrum of applications. One possible application of VR is the creation of environments manipulated in ways that help patients with disturbances in the spatial allocation of visual attention (so-called hemispatial neglect). One approach to ameliorating neglect is to apply cross-modal cues (ie, cues in sensory modalities other than vision, eg, auditory and tactile cues) to guide visual attention toward the neglected space. So far, no study has investigated the effects of audio-tactile cues in VR on the spatial deployment of visual attention in patients with neglect.
Objective
This pilot study aimed to investigate the feasibility and usability of multimodal (audio-tactile) cueing, implemented in a 3D VR setting, in patients with neglect, and to obtain preliminary results on the effects of different cue types on visual attention allocation compared with noncued conditions.
Methods
Patients were placed in a virtual environment using a head-mounted display (HMD). The inlay of the HMD was equipped to deliver tactile feedback to the forehead. The task was to find and flag birds as they appeared. Birds could appear at 4 presentation angles (lateral and paracentral on the left and right sides), either with a spatially meaningful cue (auditory, tactile, or audio-tactile) or without one (no cue). Task usability and feasibility, as well as 2 simple in-task measures (performance and early orientation), were assessed in 12 right-hemispheric stroke patients with neglect (5 with and 7 without additional somatosensory impairment).
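The factorial structure of the task (4 presentation angles crossed with 4 cue conditions) can be made concrete with a minimal sketch; the condition labels, repetition count, and function name below are illustrative assumptions, not details of the study software.

```python
# Minimal illustrative sketch (not the study's software) of enumerating and
# shuffling the 4 angle x 4 cue within-subject conditions for one patient.
import itertools
import random

ANGLES = ["left_lateral", "left_paracentral", "right_paracentral", "right_lateral"]
CUES = ["none", "auditory", "tactile", "audio_tactile"]
REPETITIONS = 3  # hypothetical number of targets per condition


def build_trial_list(seed: int) -> list[tuple[str, str]]:
    """Cross the 4 presentation angles with the 4 cue conditions and
    return a randomized trial order for one patient."""
    trials = list(itertools.product(ANGLES, CUES)) * REPETITIONS
    random.Random(seed).shuffle(trials)
    return trials


if __name__ == "__main__":
    for angle, cue in build_trial_list(seed=1)[:5]:
        print(f"target at {angle}, cue: {cue}")
```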
Results
The new VR setup showed high usability (mean score 10.2, SD 1.85; maximum score 12) and no relevant side effects (mean score 0.833, SD 0.834; maximum score 21). A repeated measures ANOVA on the task performance data, with presentation angle, cue type, and group as factors, revealed a significant main effect of cue type (F3,30=9.863; P<.001) and a significant 3-way interaction (F9,90=2.057; P=.04). Post hoc analyses revealed that in patients without somatosensory impairment, any cue led to better performance than no cue for targets on the left side, and audio-tactile cues did not appear to have additive effects. In patients with somatosensory impairment, performance was better with auditory and audio-tactile cueing than with no cue at every presentation angle; in contrast, tactile cueing alone had no significant effect at any presentation angle. Analysis of the early orientation data showed that any type of cue triggered better orientation in both groups at the lateral presentation angles, possibly reflecting an early alerting effect.
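To make the reported analysis concrete, the following is a minimal sketch, assuming synthetic data and hypothetical column names, of how a repeated measures ANOVA over presentation angle and cue type could be run with statsmodels in Python; it does not reproduce the study's actual analysis, and the between-subject group factor of the reported mixed design is omitted because AnovaRM handles only within-subject factors.

```python
# Illustrative repeated measures ANOVA (within factors: angle, cue) on
# synthetic performance scores; all data and column names are assumptions.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(0)
patients = [f"p{i:02d}" for i in range(1, 13)]              # 12 patients
angles = ["left_lat", "left_para", "right_para", "right_lat"]
cues = ["none", "auditory", "tactile", "audio_tactile"]

# One fake performance score per patient and condition cell (balanced design).
rows = [
    {"patient": p, "angle": a, "cue": c,
     "performance": rng.normal(loc=0.7, scale=0.1)}
    for p in patients for a in angles for c in cues
]
df = pd.DataFrame(rows)

res = AnovaRM(data=df, depvar="performance",
              subject="patient", within=["angle", "cue"]).fit()
print(res)  # F and P values for angle, cue, and their interaction
```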
Conclusions
Overall, audio-tactile cueing seems to be a promising method for guiding patients' attention. In the future, it could, for instance, be used as an add-on method to support attentional orientation during established therapeutic approaches.