Human touch is an inherently active sense: to estimate an object’s shape, humans often move their hand across its surface. In this way, the object is sampled both in a serial fashion (sampling different parts of the object across time) and in a parallel fashion (sampling using different parts of the hand simultaneously). Both the serial (moving a single finger) and parallel (static contact with the entire hand) exploration modes provide reliable and similar global shape information, suggesting that this information may be shared early in the sensory cortex. Here, in contrast, we show the opposite. Using an adaptation-and-transfer paradigm, a change in haptic perception was induced by slant adaptation using either the serial or the parallel exploration mode. A unified shape-based coding would predict that this adaptation would equally affect perception in other exploration modes. However, we found that adaptation-induced perceptual changes did not transfer between exploration modes. Instead, serial and parallel exploration components adapted simultaneously, but to different kinaesthetic aspects of exploration behaviour rather than to object shape per se. These results indicate that information from different exploration modes can only be combined at downstream cortical processing stages, at which adaptation is no longer effective.
We experience the world mostly in a multisensory fashion, using a combination of all of our senses. Depending on the modality, we can select different exploration strategies for extracting perceptual information. For instance, using touch we can enclose an object in our hand to explore its parts in parallel. Alternatively, we can trace the object with a single finger to explore its parts in a serial fashion. In this study we investigated whether the exploration mode (parallel vs. serial) affects the way sensory signals are combined. To this end, participants visually and haptically explored surfaces that varied in roll angle and indicated which side of the surface was perceived as higher. In Experiment 1, the exploration mode was the same for both modalities (i.e., both parallel or both serial). In Experiment 2, we introduced a difference in exploration mode between the two modalities (visual exploration was parallel while haptic exploration was serial, or vice versa). The results showed that visual and haptic signals were combined in a statistically optimal fashion only when the exploration modes were the same. In the case of an asymmetry in exploration modes across modalities, integration was suboptimal. This indicates that spatiotemporal discrepancies in the acquisition of information by the two senses (i.e., haptic and visual) can lead to the breakdown of sensory integration.
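The "statistically optimal" benchmark referred to above is conventionally the maximum-likelihood cue-combination model, in which each modality's estimate is weighted by its reliability (inverse variance). The sketch below illustrates that standard model as a point of reference; it is an assumed textbook formulation, not the authors' specific analysis, and the numerical values are purely illustrative.

```python
# Standard maximum-likelihood (MLE) model of visual-haptic cue combination:
# each cue is weighted by its reliability (inverse variance), and the fused
# estimate has lower variance than either cue alone. This is a generic
# sketch of the optimality benchmark, not the paper's actual analysis.

def mle_combine(est_v, var_v, est_h, var_h):
    """Reliability-weighted fusion of a visual and a haptic slant estimate.

    est_v, est_h : single-cue estimates (e.g., perceived roll angle in deg)
    var_v, var_h : variances of those estimates
    """
    w_v = (1.0 / var_v) / (1.0 / var_v + 1.0 / var_h)  # visual weight
    w_h = 1.0 - w_v                                     # haptic weight
    combined = w_v * est_v + w_h * est_h
    # Optimal integration predicts a reduced variance:
    combined_var = (var_v * var_h) / (var_v + var_h)
    return combined, combined_var

# Illustrative values: equally reliable cues average their estimates,
# and the combined variance is half the single-cue variance.
est, var = mle_combine(10.0, 4.0, 14.0, 4.0)
```

Suboptimal integration, as found when the exploration modes differed, would show up as a combined variance no smaller than that of the more reliable single cue.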
When picking up objects using a pinch grip, there are usually numerous places at which one could place the thumb and index finger. Yet, people seem to consistently place them at or close to the centre of mass (COM), presumably to minimize torque and therefore the required grip force. People also prefer to grasp objects by parallel surfaces and ones with higher friction coefficients (rough surfaces), to prevent the object from slipping when they lift it. Here, we examine the trade-off between friction and COM. Participants were asked to grasp and lift aluminium bars of which one end was polished and therefore smooth and the other was rough. Their finger positions were recorded to determine how they grasped the objects. The bars were oriented horizontally in the frontal plane, with the centre aligned with the participants' body midline. The bars varied in the horizontal offset between the COM and the edge of the rough region. The offset could be 0, 1 or 2 cm. We expected participants to grasp closer to the rough area than the centre of the bar. Completely rough bars and completely smooth bars served as control conditions. The slipperiness of the surface that was grasped affected the height of the grasping points, indicating that participants adjusted their grasping behaviour to the slipperiness of the surface. However, the tendency to grasp closer to the rough area was minimal. This shows that the judged COM largely determines how an object is grasped. Friction has very limited influence.
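The trade-off examined above can be made concrete with an idealized two-finger grip model: grasping away from the centre of mass introduces a gravitational torque, while a lower friction coefficient raises the minimum grip force needed to prevent slip. The following sketch states those two textbook relations under assumed idealizations (rigid bar, point contacts, Coulomb friction); it is illustrative only and not the authors' analysis.

```python
# Idealized pinch-grip model (assumed simplification, not the paper's
# analysis): the cost of grasping off the centre of mass (COM) versus
# the cost of grasping a slippery surface.

G = 9.81  # gravitational acceleration, m/s^2

def min_grip_force(mass, mu):
    """Minimum normal force (N) per finger to prevent slip in a two-finger
    pinch with Coulomb friction: 2 * mu * F >= mass * G."""
    return mass * G / (2.0 * mu)

def torque_about_grasp(mass, offset):
    """Gravitational torque (N*m) about the grasp axis when grasping
    `offset` metres from the COM; zero when grasping at the COM."""
    return mass * G * offset

# Illustrative values: a 0.2 kg bar grasped 2 cm off-centre, comparing a
# rough (mu = 0.8) and a smooth (mu = 0.3) contact surface.
rough_force = min_grip_force(0.2, 0.8)
smooth_force = min_grip_force(0.2, 0.3)
offset_torque = torque_about_grasp(0.2, 0.02)
```

In this idealization, grasping the smooth region demands more grip force, while grasping off the COM demands extra force to resist rotation; the finding above suggests participants weight the COM term far more heavily than the friction term.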
In our daily life, we often interact with objects using both hands, raising the question of to what extent information is shared between the hands. It has, for instance, been shown that curvature adaptation aftereffects can transfer from the adapted hand to the non-adapted hand. However, this transfer only occurred for dynamic exploration, e.g., moving a single finger over a surface, but not for static exploration, i.e., keeping stationary contact with the surface and combining the information from different parts of the hand. This raises the further question of to what extent adaptation to object shape is shared between the hands when both hands are used statically and simultaneously, and the object-shape estimate requires information from both hands. Here we addressed this question in three experiments using a slant-adaptation paradigm. In Experiment 1 we investigated whether an aftereffect of static bimanual adaptation occurs at all and whether it transfers to conditions in which one hand was moving. In Experiment 2, participants adapted either to a felt slanted surface or simply by holding their hands in mid-air at similar positions, to investigate to what extent the effects of static bimanual adaptation are posture-based rather than object-based. Experiment 3 further explored the idea that bimanual adaptation is largely posture-based. We found that bimanual adaptation using static touch did lead to aftereffects when the same static exploration mode was used for testing. However, the aftereffect did not transfer to any exploration mode that included a dynamic component. Moreover, we found similar aftereffects both with and without a haptic surface. Thus, we conclude that static bimanual adaptation is proprioceptive in nature and does not occur at the level at which the object is represented.