In the present study, we examined the eye movement behaviour of children and adults looking at five Van Gogh paintings in the Van Gogh Museum, Amsterdam. The goal of the study was to determine the role of top-down and bottom-up attentional processes in the first stages of participants’ aesthetic experience. Bottom-up processes were quantified by computing a salience map for each painting. Top-down processing was manipulated by first allowing participants to view the paintings freely, then providing background information about each painting, and then allowing them to view the paintings a second time. The salience analysis showed differences between the eye movement behaviour of children and adults, as well as differences between the two phases. In children, the first five fixations during the free-viewing phase were strongly related to visually salient features of the paintings, indicating a strong role for bottom-up factors. In the second phase, after children had received the background information, top-down factors played a more prominent role. By contrast, the patterns observed in adults were similar in both phases, indicating that bottom-up processes did not play a major role when they viewed the paintings. In the second phase, both children and adults spent more time looking at regions that were mentioned in the background information. This effect was greater for adults than for children, confirming the notion that adults rely much more on top-down processing than children when viewing paintings.
Adopting the perspective of another person is an important aspect of social cognition and has been shown to depend on multisensory signals from one’s own body. Recent work suggests that interoceptive signals contribute not only to own-body perception and self-consciousness, but also to empathy. Here we investigated whether social cognition, in particular adopting the perspective of another person, can be altered by a systematic manipulation of interoceptive cues and, further, whether this effect depends on empathic ability. The own-body transformation task (OBT), in which participants are instructed to imagine taking the perspective and position of a virtual body presented on a computer screen, offers an effective way to measure reaction time differences linked to the mental effort of taking another person’s perspective. Here, we adapted the OBT by flashing a silhouette surrounding the virtual body either synchronously or asynchronously with the timing of participants’ heartbeats. We evaluated the impact of this cardio-visual synchrony on reaction times and accuracy rates in the OBT. Empathy was assessed with the empathy quotient (EQ) questionnaire. Based on previous work using the cardio-visual paradigm, we predicted that synchronous (vs. asynchronous) cardio-visual stimulation would increase self-identification with the virtual body and facilitate participants’ ability to adopt the virtual body’s perspective, thereby enhancing performance on the task, particularly in participants with higher empathy scores. We report that participants with high empathy performed significantly better on the OBT during synchronous versus asynchronous cardio-visual stimulation. Moreover, we found a significant positive correlation between empathic ability and the synchrony effect (the difference in reaction times between the asynchronous and synchronous conditions). We conclude that synchronous cardio-visual stimulation between the participant’s body and a virtual body during the OBT makes it easier to adopt the virtual body’s perspective, presumably based on multisensory integration processes. However, this effect depended on empathic ability, suggesting that empathy, interoception and social perspective taking are inherently linked.
In road-crossing situations involving negotiation with approaching vehicles, pedestrians need to take the behavior of the car into account before making a decision. To investigate what kind of information about the car pedestrians seek, and where they look for it, we conducted an eye-tracking study with 26 participants and analyzed their fixation behavior while they interacted with a manually driven vehicle that approached while slowing down and displaying yielding behavior. The results show that pedestrians' gaze behavior toward a vehicle in road-crossing situations follows a clear pattern as a function of the vehicle's distance. When the car is far away, pedestrians look at the environment or the road space ahead of the car. As the car approaches, their gaze gradually shifts to its windshield. We conclude by discussing the implications of this insight for the user-centered design of optimal external Human-Machine Interfaces for automated vehicles.
Overtrust and undertrust are major issues with partially automated vehicles. Ideally, trust should be calibrated so that drivers’ subjective feelings of safety match the objective reliability of the vehicle. In the present study, we examined whether drivers’ trust in Level 2 vehicles changed after on-road experience. Drivers’ self-reported trust was assessed three times: before they had any experience with these vehicles, immediately after driving two types of vehicles, and two weeks after the driving experience. Analysis of the results showed major changes in trust scores after the on-road driving experience. Before experiencing the vehicles, participants tended to overestimate the vehicles’ capabilities. Afterwards, they had a better understanding of the vehicles’ limitations, resulting in better-calibrated trust.