Although previous studies have shown that an emotional context can alter touch processing, it is unclear how visual contextual information modulates sensory signals, and at what levels this modulation takes place. We therefore investigated how a toucher’s emotional expressions (anger, happiness, fear, and sadness) modulate the touchee’s somatosensory evoked potentials (SEPs) in different temporal ranges. Participants received tactile stimulation that appeared to originate from expressive characters in virtual reality. Touch processing was indexed using SEPs, and self-reports of the touch experience were collected. Early potentials were amplified after angry, happy, and sad facial expressions, whereas late potentials were amplified after anger but attenuated after happiness. These effects map onto two stages of emotional modulation of tactile perception: anticipation and interpretation. The findings show that not only does touch affect emotion, but emotional expressions also affect touch perception. Affective modulation of touch emerged as early as 25 ms after touch onset, suggesting that emotional context is integrated into the tactile sensation at a very early stage.
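To make the SEP measure concrete, here is a minimal, hypothetical sketch of how an evoked potential is typically derived: continuous EEG is epoched around touch onsets, baseline-corrected, and averaged, after which the amplitude in an early window (e.g., around 25 ms) can be read out. The function names, sampling rate, and window parameters below are illustrative assumptions, not the authors' pipeline.

```python
# Minimal sketch (assumed, not the study's actual analysis): epoch-average
# an evoked potential and read out an early-window amplitude.
import numpy as np

def evoked_potential(eeg, onsets, fs, tmin=-0.1, tmax=0.4):
    """Average EEG epochs time-locked to stimulus onsets (sample indices),
    with pre-stimulus baseline correction."""
    pre, post = int(-tmin * fs), int(tmax * fs)
    epochs = np.stack([eeg[o - pre:o + post] for o in onsets])
    epochs -= epochs[:, :pre].mean(axis=1, keepdims=True)  # baseline
    times = np.arange(-pre, post) / fs
    return times, epochs.mean(axis=0)

def window_amplitude(times, erp, center=0.025, half_width=0.005):
    """Mean amplitude in a window around `center` seconds (e.g., ~25 ms)."""
    mask = np.abs(times - center) <= half_width
    return float(erp[mask].mean())

# Demo on synthetic data (all values arbitrary):
fs = 1000.0
rng = np.random.default_rng(0)
eeg = rng.standard_normal(60_000)
onsets = np.arange(1_000, 59_000, 600)  # sample indices of touch onsets
times, sep = evoked_potential(eeg, onsets, fs)
print(f"~25 ms amplitude: {window_amplitude(times, sep):.3f} (a.u.)")
```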
With the advent of consumer-grade virtual reality (VR) headsets and physiological measurement devices, new possibilities for mediated social interaction emerge, enabling immersion in environments whose visual features react to the users' physiological activation. In this study, we investigated whether and how individual and interpersonally shared biofeedback (visualised respiration rate and frontal asymmetry of electroencephalography, EEG) enhances physiological synchrony between the users and their perceived empathy towards each other during a compassion meditation exercise carried out in a social VR setting. The study was conducted as a laboratory experiment (N = 72) employing a Unity3D-based Dynecom immersive social meditation environment and two amplifiers to collect the psychophysiological signals for the biofeedback. Biofeedback on empathy-related EEG frontal asymmetry evoked higher self-reported empathy towards the other user than biofeedback on respiratory activation, but perceived empathy was highest when both types of feedback were presented simultaneously. In addition, participants reported more empathy when EEG frontal asymmetry synchronization between the users was stronger. These results inform the field of affective computing about the possibilities that VR offers for different applications of empathic technologies.
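As background on the EEG metric named above, the sketch below shows one common way frontal alpha asymmetry is computed: the difference in log alpha-band (roughly 8 to 13 Hz) power between right and left frontal electrodes. The channel labels (F3/F4), band edges, and all parameters are illustrative assumptions; the study's actual biofeedback pipeline is not described here.

```python
# A minimal sketch, assuming the common frontal alpha asymmetry (FAA)
# definition ln(right alpha power) - ln(left alpha power); channel labels,
# sampling rate, and band edges are illustrative assumptions.
import numpy as np
from scipy.integrate import trapezoid
from scipy.signal import welch

def alpha_power(signal, fs, band=(8.0, 13.0)):
    """Estimate alpha-band power by integrating the Welch PSD."""
    freqs, psd = welch(signal, fs=fs, nperseg=int(2 * fs))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return trapezoid(psd[mask], freqs[mask])

def frontal_asymmetry(f3, f4, fs):
    """FAA > 0 is conventionally read as relative left-frontal activation,
    since alpha power varies inversely with cortical activity."""
    return np.log(alpha_power(f4, fs)) - np.log(alpha_power(f3, fs))

# Demo on synthetic signals (all values arbitrary):
fs = 256.0
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(1)
f3 = np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(t.size)
f4 = 0.7 * np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(t.size)
print(f"FAA: {frontal_asymmetry(f3, f4, fs):+.3f}")
```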