The tactful networking paradigm is expected to play a crucial role in next-generation networks. Accordingly, adaptive human-aware environments, sensitive to daily human behavior and individual traits, must be provided in order to offer users a fully immersive and customized experience. On the basis of data collected in actual cognitive experiments, this paper proposes a learning framework to discover the multi-sensory human perceptual experience. The framework applies a mixture density network to identify the perception model for each sense, and then performs multi-sensory integration according to the underlying neuro-cognitive model. Furthermore, a supervised learning module clusters the users on the basis of the perception identification strategy designed above, assuming a multimodal structure for the cognitive brain activity. Finally, a practical contextualization is presented for haptic virtual reality services. The results show the effectiveness of the tactful, i.e., brain-aware, approach embodied by the proposed framework, which is validated against the more conventional brain-agnostic scheme. Indeed, the system performance, expressed as the reliability of guaranteeing service exploitation before a target deadline based on the integrated perception, improves remarkably under the brain-aware strategy, which exploits knowledge of human perception.
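To make the mixture-density-network step concrete, the following minimal sketch (not the paper's actual model) shows how input features can be mapped to the parameters of a Gaussian mixture and how the likelihood of an observed perceptual response is evaluated. The single linear layer, the component count `K`, and the feature dimension `D` are illustrative assumptions; a real MDN would use a deeper trained network.

```python
import numpy as np

def mdn_forward(x, W, b, K):
    """Hypothetical MDN head: map features x to mixture parameters.

    A single linear layer stands in for the full network; it outputs
    3*K values split into component logits, means, and log-stds.
    """
    z = x @ W + b
    logits, mu, log_sigma = z[:K], z[K:2 * K], z[2 * K:]
    pi = np.exp(logits - logits.max())
    pi /= pi.sum()                     # softmax -> mixture weights
    sigma = np.exp(log_sigma)          # ensures positive std deviations
    return pi, mu, sigma

def mdn_nll(y, pi, mu, sigma):
    """Negative log-likelihood of a scalar response y under the mixture."""
    comp = pi * np.exp(-0.5 * ((y - mu) / sigma) ** 2) \
           / (sigma * np.sqrt(2.0 * np.pi))
    return -np.log(comp.sum())

# Illustrative setup: K mixture components, D stimulus features.
rng = np.random.default_rng(0)
K, D = 3, 4
W = rng.normal(scale=0.1, size=(D, 3 * K))
b = np.zeros(3 * K)
x = rng.normal(size=D)                 # e.g. per-sense stimulus features
pi, mu, sigma = mdn_forward(x, W, b, K)
```

In a training loop, `mdn_nll` would serve as the loss minimized over the experimental data; per-sense mixtures obtained this way could then feed the multi-sensory integration stage.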