Various living creatures exhibit embodied intelligence, which arises from the collaborative interaction of the brain, body, and environment. The behavior of an embodied intelligent agent is generated through continuous, dynamic interaction with its environment via information perception and physical manipulation. Physical interaction between a robot and its environment is the basis for realizing embodied perception and learning, and tactile information plays a critical role in this interaction: it helps ensure safety, stability, and compliance, and it provides unique information that is difficult to capture with other perception modalities. However, owing to the limitations of existing sensors and of current perception and learning methods, robotic tactile research lags significantly behind research on other sensing modalities, such as vision and hearing, which severely restricts the development of robotic embodied intelligence. This paper presents the current challenges of robotic tactile embodied intelligence and reviews the theory and methods of robotic embodied tactile intelligence. Tactile perception and learning methods for embodied intelligence can be designed on the basis of newly developed large-scale tactile array sensing devices, with the aim of achieving breakthroughs in neuromorphic computing technology for tactile intelligence.