Perhaps the most recognizable sensory map in all of neuroscience is the somatosensory homunculus. Though it seems straightforward, this simple representation belies the complex link between an activation in somatosensory Area 3b and the associated touch location on the body. Any isolated activation is spatially ambiguous without a neural decoder that can read its position within the entire map, yet how this is computed by neural networks is unknown. We propose that somatosensory cortex implements multilateration, a common computation used by surveying and GPS systems to localize objects. Specifically, to decode touch location on the body, the somatosensory system estimates the relative distance between the afferent input and the body's joints. We show that a simple feedforward neural network that captures the receptive field properties of somatosensory cortex implements a Bayes-optimal multilateration decoder via a combination of bell-shaped (Area 3b) and sigmoidal (Areas 1/2) tuning curves. Simulations demonstrated that this decoder produced a unique pattern of localization variability between two joints that was not produced by other known neural decoders. Finally, we identify this neural signature of multilateration in actual psychophysical experiments, suggesting that it is a candidate computational mechanism underlying tactile localization.

take place in the frontal and parietal cortices (Burnod et al., 1999; Crawford et al., 2004; Medendorp et al., 2005; Pesaran et al., 2006). Equally crucial to localizing objects in the environment is localizing objects within the personal space of the body. Despite over 180 years of research on the sense of touch (Weber, 1834), the computations underlying tactile localization remain largely unknown. Recent accounts have suggested that tactile localization requires two computational steps (Medina and Coslett, 2010). First, afferent input must be localized within a topographic map in
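The multilateration scheme described above can be illustrated with a minimal numerical sketch. The snippet below is not the authors' model; it simply assumes that each of two joints supplies a noisy estimate of its distance to the touch, with noise that grows with distance (the `weber` and `floor` parameters are illustrative assumptions, not values from the text), and fuses the two estimates by inverse-variance weighting, the Bayes-optimal combination for independent Gaussian cues. Under these assumptions, localization variability is lowest near the joints and peaks between them, the kind of signature pattern the text describes.

```python
# Illustrative sketch of multilateration between two joints (assumed model,
# not the paper's implementation).
import numpy as np

def combined_std(x, limb_length=1.0, weber=0.1, floor=0.01):
    """Std of the fused location estimate for a touch at position x.

    Each joint yields a distance estimate whose noise grows linearly
    with distance (Weber-like scaling; parameters are assumptions).
    """
    s1 = floor + weber * x                  # noise of distance from joint 1
    s2 = floor + weber * (limb_length - x)  # noise of distance from joint 2
    # Inverse-variance (Bayes-optimal) fusion of the two location estimates
    return np.sqrt(1.0 / (1.0 / s1**2 + 1.0 / s2**2))

positions = np.linspace(0.0, 1.0, 101)
stds = combined_std(positions)
# Variability is low near either joint and highest between them.
```

The inverse-variance rule is the standard optimal-fusion result for independent Gaussian estimates; any noise model in which uncertainty grows with distance from an anchor produces the between-joint peak.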