2017
DOI: 10.1037/xhp0000358

On the origin of body-related influences on visual perception.

Abstract: The human body and the potential to move it affect the way we perceive the world. Here we explored a possible origin of such action-specific effects on perception. Participants were asked to enclose a virtual object by movements of their index finger and thumb and judged either the actual finger-thumb distance or the size of the virtual object subsequently. The visual-haptic discrepancy that comes with such virtual grasping resulted in a mutual impact of visual and body-related signals: the visual judgments of…

Cited by 14 publications (22 citation statements) · References 55 publications (97 reference statements)
“…However, a weaker form of multisensory integration, called sensory coupling (e.g., Bresciani, Dammeier, & Ernst, 2006), has also been observed in tool use, in particular in cursor-control tasks, where proprioception refers to the position of the hand and vision to the position of a cursor. Although the hand and cursor are clearly different objects, estimates of their respective positions become biased toward each other (Debats, Ernst, & Heuer, 2017a; Kirsch, Herbort, Ullrich, & Kunde, 2017; Kirsch, Pfister, & Kunde, 2016; Ladwig, Sutter, & Müsseler, 2012; Rand & Heuer, 2013). These biases depend not only on the sensory input—that is, on the proprioceptive and visual information on hand and cursor position, respectively—but also on how the biases are assessed (cf.…”
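The mutual attraction of hand and cursor estimates described in the statement above is commonly modeled as reliability-weighted cue combination, where each estimate is weighted by its inverse variance. A minimal sketch with made-up numbers (the function name, estimates, and variances are illustrative assumptions, not the authors' model):

```python
def combine(est_a, var_a, est_b, var_b):
    """Inverse-variance (reliability) weighted average of two cue estimates."""
    w_a = (1 / var_a) / (1 / var_a + 1 / var_b)
    fused = w_a * est_a + (1 - w_a) * est_b
    fused_var = 1 / (1 / var_a + 1 / var_b)  # fused estimate is more reliable than either cue
    return fused, fused_var

# Hypothetical values: proprioceptive hand-position estimate vs. visual
# cursor-position estimate, in cm; vision is assumed more reliable here.
hand, cursor = 10.0, 14.0
fused, var = combine(hand, 4.0, cursor, 1.0)
# fused lies between the two estimates, pulled toward the more reliable (visual) cue
```

Full fusion as sketched here is the limiting case; the "sensory coupling" the statement refers to is weaker, with each estimate only partially biased toward the other rather than collapsing onto a single value.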
“…We computed the median estimated distance per condition for each participant (regarding the use of a similar method, see Kirsch et al., 2017). Moreover, given that distances for the near position and for the far position were different, we computed a bias ratio expressing the medians of the estimated distances as a ratio of the medians of the actual distances to compare estimations in near and far P-T distances (see the Supplementary Tables S1, S2, available online).…”
Section: Results
“…Visual perception is known to rely on various sources of information, including visual information (Cutting and Vishton, 1995), physiological information (White et al., 2013; Witt and Riley, 2014), action intentions (Witt et al., 2004, 2005, 2010), as well as on multisensory integration processes (Campos et al., 2012, 2014; Kirsch et al., 2017). More precisely, these works show that visual and bodily variables are differently weighted during the estimation of space or object size, depending on the available sources of information.…”
Section: Discussion
“…This fact cannot be readily explained by a direct scaling approach. In one of our recent manuscripts we tackle this issue and suggest that body- or action-related effects observed in visual judgments can, in principle, be explained by well-known mechanisms of multisensory integration [28]. In a series of experiments using virtual grasping and reaching tasks, we observed that visual and somatic information is combined in judgments of the objects being grasped or reached, as well as in judgments of body states (grasping posture or distance covered by the hand), even when the body-related and visual signals were clearly spatially separated.…”
Section: Discussion