Roughness is the most salient perceptual dimension of surface texture but has no well-defined physical basis. We seek to determine the neural determinants of tactile roughness in the somatosensory nerves. Specifically, we record the patterns of activation evoked in tactile nerve fibers of anesthetized Rhesus macaques in response to a large and diverse set of natural textures and assess which aspects of these activation patterns can account for psychophysical judgments of roughness obtained from human observers. We show that perceived roughness is determined by the variation in the population response, weighted by fiber type: a surface feels rough to the extent that activity varies across nerve fibers and varies over time within individual nerve fibers. We show that this variation-based neural code can account not only for magnitude estimates of roughness but also for roughness discrimination performance.

Our sense of touch endows us with an exquisite sensitivity to the microstructure of surfaces, the most salient aspect of which is roughness. We analyze the responses evoked in tactile fibers of monkeys by natural textures and compare them to judgments of roughness obtained for the same textures from human observers. We then describe how texture signals from three populations of nerve fibers are integrated to culminate in a percept of roughness.
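The variation-based code described above lends itself to a simple numerical sketch. The snippet below is a hypothetical illustration, not the authors' analysis: the firing-rate matrix, the fiber-type labels, and the per-type weights are all invented for demonstration. The roughness proxy simply sums across-fiber (spatial) and within-fiber (temporal) variation for each fiber type, weighted by type.

```python
import numpy as np

# Hypothetical illustration of a variation-based roughness code.
# rates: simulated firing rates for one texture, shape (n_fibers, n_time_bins).
# fiber_type: label per fiber; the weights below are illustrative, not fitted.

rng = np.random.default_rng(0)
n_fibers, n_bins = 12, 50
rates = rng.poisson(lam=8.0, size=(n_fibers, n_bins)).astype(float)
fiber_type = np.array(["SA1", "RA", "PC"] * 4)
weights = {"SA1": 1.0, "RA": 0.5, "PC": 0.25}  # assumed weights, for demo only

def roughness_estimate(rates, fiber_type, weights):
    """Weighted sum of spatial (across-fiber) and temporal (within-fiber)
    variation in the population response."""
    total = 0.0
    for ftype, w in weights.items():
        sub = rates[fiber_type == ftype]
        spatial_var = sub.mean(axis=1).std()   # variation across fibers
        temporal_var = sub.std(axis=1).mean()  # variation across time, per fiber
        total += w * (spatial_var + temporal_var)
    return total

print(roughness_estimate(rates, fiber_type, weights))
```

A perfectly uniform response (every fiber firing at a constant rate) yields zero under this proxy, capturing the idea that a surface feels rough only to the extent that the population response varies.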
Tactile feature extraction is essential to guide the dexterous manipulation of objects. The longstanding theory is that geometric features at each location of contact between hand and object are extracted from the spatial layout of the response of populations of tactile nerve fibers. However, recent evidence suggests that some features (e.g., edge orientation) are extracted very rapidly (<200 ms), casting doubt on the idea that this extraction relies on a spatial code, which ostensibly requires integrating responses over time. An alternative hypothesis is that orientation is conveyed in precise temporal spiking patterns. Here we simulate, using a recently developed and validated model, the responses of the two relevant subpopulations of tactile fibers from the entire human fingertip (~800 afferents) to edges indented into the skin. We show that edge orientation can be quickly (<50 ms) and accurately (<3°) decoded from the spatial pattern of activation across the afferent population, starting with the very first spike. Next, we implement a biomimetic decoder of edge orientation, consisting of a bank of oriented Gabor filters, designed to mimic the documented responses of cortical neurons. We find that the biomimetic approach yields orientation decoding performance that approaches the limit set by optimal decoders and is more robust to changes in other stimulus features. Finally, we show that orientation signals, measured from single units in the somatosensory cortex of nonhuman primates (2 macaque monkeys, 1 female), follow a time course consistent with that of their counterparts in the nerve. We conclude that a spatial code is fast and accurate enough to support object manipulation. NEW & NOTEWORTHY The dexterous manipulation of objects relies on the rapid and accurate extraction of the objects' geometric features by the sense of touch.
Here we simulate the responses of all the nerve fibers that innervate the fingertip when an edge is indented into the skin and characterize the time course over which signals about its orientation evolve in this neural population. We show that orientation can be rapidly and accurately decoded from the spatial pattern of afferent activation using spatial filters that mimic the response properties of neurons in somatosensory cortex, along a time course consistent with that observed in cortex. We conclude that the classical model of tactile feature extraction is rapid and accurate enough to support object manipulation.
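The biomimetic decoder described above, a bank of oriented Gabor filters applied to the spatial pattern of afferent activation, can be sketched as follows. This is a toy reconstruction under stated assumptions: the filter parameters, grid size, and synthetic edge stimulus are all illustrative and are not taken from the paper. Each filter's response energy is computed, and the orientation of the best-responding filter is reported.

```python
import numpy as np

def gabor(theta, size=21, sigma=4.0, wavelength=10.0):
    """Gabor filter whose stripes run along orientation theta (radians).
    Parameter values are illustrative assumptions."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + yr**2) / (2 * sigma**2))
    carrier = np.cos(2 * np.pi * yr / wavelength)
    return envelope * carrier

def decode_orientation(activation, n_orients=36):
    """Return the orientation (degrees) of the Gabor filter with the
    largest response energy to the activation map."""
    thetas = np.linspace(0, np.pi, n_orients, endpoint=False)
    energy = [abs(np.sum(activation * gabor(t))) for t in thetas]
    return np.degrees(thetas[int(np.argmax(energy))])

# Toy "afferent activation map": a narrow ridge of activity where an edge
# oriented at 30 degrees is indented into the simulated fingertip.
size, half = 21, 10
y, x = np.mgrid[-half:half + 1, -half:half + 1]
theta_true = np.radians(30)
dist = np.abs(-x * np.sin(theta_true) + y * np.cos(theta_true))
activation = np.exp(-dist**2 / 2.0)

print(decode_orientation(activation))
```

With this synthetic stimulus, the decoder recovers an orientation close to the true 30°; the angular resolution here is set by the number of filters in the bank (36 filters spanning 180° gives 5° steps), whereas the paper reports accuracy under 3° with optimal and biomimetic decoders applied to simulated afferent populations.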