The sense of touch is considered an essential feature for robots to improve the quality of their physical and social interactions. For instance, tactile devices have to be fast enough to interact in real time, robust against noise when processing raw sensory information, and adaptive enough to represent the structure and topography of the tactile sensor itself, i.e., the shape of the sensor surface and its dynamic resolution. In this paper, we conduct experiments with a self-organizing map (SOM) neural network that adapts to the structure of a tactile sheet and the spatial resolution of the input tactile device; this adaptation is faster and more robust against noise than image reconstruction techniques based on Electrical Impedance Tomography (EIT). Further advantages of this bio-inspired reconstruction algorithm are its simple mathematical formulation and its ability to self-calibrate its topographical organization without any a priori information about the input dynamics. Our results show that the spatial patterns of single and multiple contact points can be acquired and localized with sufficient speed and precision for pattern recognition tasks during physical contact.
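As a rough illustration of the self-calibrating, topographic adaptation the abstract describes, the following is a minimal Kohonen-style SOM sketch. The grid size, learning-rate and neighborhood schedules, and the uniform synthetic input are illustrative assumptions, not the parameters or tactile data used in the paper.

```python
# Minimal self-organizing map (SOM) sketch: a grid of units adapts its
# weight vectors to cover the input space without any a priori model of
# the input dynamics. All hyperparameters below are assumed for illustration.
import numpy as np

def train_som(data, grid=(8, 8), epochs=20, lr0=0.5, sigma0=3.0, seed=0):
    rng = np.random.default_rng(seed)
    rows, cols = grid
    n_units = rows * cols
    # Unit weights start random; the fixed grid coordinates define the topology.
    weights = rng.random((n_units, data.shape[1]))
    coords = np.array([(r, c) for r in range(rows) for c in range(cols)], float)
    n_steps = epochs * len(data)
    step = 0
    for _ in range(epochs):
        for x in rng.permutation(data):
            t = step / n_steps
            lr = lr0 * (1 - t)              # linearly decaying learning rate
            sigma = sigma0 * (1 - t) + 0.5  # shrinking neighborhood radius
            # Best-matching unit: the unit whose weight vector is closest to x.
            bmu = np.argmin(((weights - x) ** 2).sum(axis=1))
            # Gaussian neighborhood on the grid pulls nearby units along,
            # which is what produces the topographic (map-like) organization.
            d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
            h = np.exp(-d2 / (2 * sigma ** 2))
            weights += lr * h[:, None] * (x - weights)
            step += 1
    return weights

# Train on points sampled uniformly from the unit square: the weights
# spread out to cover the input space, mimicking self-calibration to a
# sensor surface (here a hypothetical square sheet, not the paper's setup).
rng = np.random.default_rng(1)
data = rng.random((500, 2))
w = train_som(data)
print(w.shape)
```

In a tactile setting, each input sample would instead be a contact-point estimate from the sensor sheet, and the trained map's best-matching unit then localizes new contacts on the learned topography.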