Despite its essential role in human coexistence, the developmental origins and progression of sympathy in infancy are not yet fully understood. We show that preverbal 10-month-old infants manifest sympathetic responses, evinced by a preference for attacked others that reflects their evaluation of the respective roles of victim, aggressor, and neutral party. In Experiment 1, infants who viewed an aggressive social interaction between a victim and an aggressor subsequently preferred the victim. In Experiment 2, when the victim and the aggressor were each compared with a neutral object, infants preferred the victim and avoided the aggressor. These findings indicate that 10-month-olds not only evaluate the roles of victims and aggressors in interactions but also show rudimentary sympathy toward others in distress based on that evaluation. This simple preference may serve as a foundation for full-fledged sympathetic behavior later in development.
This study provides the first physiological evidence of humans' ability to empathize with robot pain and highlights differences between empathy for humans and empathy for robots. We performed electroencephalography in 15 healthy adults who observed pictures of human or robot hands in painful or non-painful situations, such as a finger being cut by a knife. We found that the descending phase of the P3 component was larger for painful than for non-painful stimuli, regardless of whether the hand belonged to a human or a robot. In contrast, the ascending phase of the P3 component at frontal-central electrodes was increased by painful human stimuli but not by painful robot stimuli, although this ANOVA interaction was only marginally significant. These results suggest that we empathize with humanoid robots during late top-down processing much as we do with other humans, whereas the beginning of the top-down process of empathy is weaker for robots than for humans.
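To make the phase terminology concrete, the following is a minimal sketch (with an assumed sampling rate, assumed window boundaries, and simulated epochs standing in for real EEG; this is not the authors' pipeline) of quantifying the ascending and descending phases of the P3 as mean amplitudes of the trial-averaged waveform in early and late post-onset windows:

import numpy as np

FS = 500                  # sampling rate in Hz (assumed)
ASCENDING = (0.30, 0.40)  # s after stimulus onset (assumed window before the P3 peak)
DESCENDING = (0.40, 0.60) # s after stimulus onset (assumed window after the P3 peak)

def phase_amplitudes(epochs: np.ndarray, baseline: float = 0.2):
    """Mean ERP amplitude in the ascending and descending P3 windows.

    epochs: (n_trials, n_samples) array; each epoch starts `baseline`
    seconds before stimulus onset.
    """
    erp = epochs.mean(axis=0)  # average over trials -> event-related potential

    def window_mean(win):
        i0 = int((baseline + win[0]) * FS)
        i1 = int((baseline + win[1]) * FS)
        return erp[i0:i1].mean()

    return window_mean(ASCENDING), window_mean(DESCENDING)

# Hypothetical usage: 60 simulated 1-s epochs (0.2 s baseline) of noise
# as a stand-in for recordings from a frontal-central electrode.
rng = np.random.default_rng(1)
epochs = rng.normal(size=(60, FS))
print(phase_amplitudes(epochs))

In this scheme, the condition contrast described in the abstract would compare these window means across painful versus non-painful and human versus robot conditions.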
Body ownership can be modulated through illusory visual-tactile integration or through visual-motor synchronicity/contingency. Recently, it has been reported that illusory ownership of an invisible body can be induced by illusory visual-tactile integration from a first-person view. We aimed to test whether similar illusory ownership of an invisible body could be induced by the active method of visual-motor synchronicity, and whether the illusory invisible body could be experienced in front of, and facing away from, the observer. Participants viewed left and right white gloves and socks 2 m in front of them in a virtual room through a head-mounted display; the gloves and socks moved in synchrony with the observers' actions. We tested the effect of synchronization and compared the invisible body with a whole-body avatar, measuring self-localization drift. Visible hands and feet alone were sufficient to induce illusory body ownership, and the effect was as strong as that for a whole-body avatar.
The spatial luminance relationship between shading patterns and specular highlights has been suggested as a cue for perceptual translucency (Motoyoshi, 2010). Although local image features are also important for translucency perception (Fleming & Bülthoff, 2005), they have rarely been investigated. Here, we aimed to extract the spatial regions related to translucency perception from computer graphics (CG) images of objects using a psychophysical reverse-correlation method. From many trials in which observers compared the perceptual translucency of two CG images, we obtained translucency-related patterns showing which image regions were related to perceptual translucency judgments. An analysis of the luminance statistics calculated within these image regions showed that (1) the global RMS contrast of the entire CG image was not related to perceptual translucency, and (2) the local mean luminance of specific image regions within the CG images correlated well with perceptual translucency. However, the image regions contributing to perceptual translucency differed greatly between observers. These results suggest that perceptual translucency does not rely on global luminance statistics such as global RMS contrast, but rather depends on local image features within specific image regions. There may be some "hot spots" effective for perceptual translucency, although which of the many hot spots are used in judging translucency may be observer dependent.
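As a concrete illustration of the two statistics the abstract contrasts, here is a minimal sketch (assuming a generic grayscale image array and a hypothetical boolean region mask; this is not the authors' analysis code) of computing global RMS contrast over a whole image versus mean luminance within a local region:

import numpy as np

def global_rms_contrast(image: np.ndarray) -> float:
    """RMS contrast: standard deviation of luminance divided by its mean."""
    lum = image.astype(float)
    return lum.std() / lum.mean()

def local_mean_luminance(image: np.ndarray, region_mask: np.ndarray) -> float:
    """Mean luminance restricted to a region of interest (boolean mask)."""
    return image.astype(float)[region_mask].mean()

# Hypothetical usage with a random stand-in for a CG image and a central
# square region standing in for a translucency-related "hot spot".
rng = np.random.default_rng(0)
img = rng.uniform(0.0, 1.0, size=(256, 256))
mask = np.zeros(img.shape, dtype=bool)
mask[96:160, 96:160] = True
print(global_rms_contrast(img), local_mean_luminance(img, mask))

The abstract's finding corresponds to the first statistic failing to predict translucency judgments while the second, computed in observer-specific regions, correlates with them.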
Attentional effects on self-motion perception (vection) were examined using a large display in which vertical stripes containing upward- or downward-moving dots were interleaved to balance the total motion energy in the two directions. Dots moving in the same direction shared the same colour, and subjects were asked to attend to one of the two colours. Vection was perceived in the direction opposite to that of the non-attended motion, indicating that non-attended visual motion dominates vection. The attentional effect was then compared with the effect of relative depth. Clear attentional effects were again found when there was no relative depth between the dots moving in opposite directions, but when relative depth was present, its effect was much stronger. Vection was mainly determined by motion in the far depth plane, although some attentional effects were evident even in this case. These results indicate that attentional modulation of vection exists, but that it is overridden when there is relative depth between the two motion components.