Smart wearable technologies are rapidly revolutionizing how we interact with the real and virtual worlds through a variety of virtual and augmented reality (VR/AR) systems [1,2] and are underpinning advances in several areas of daily life, such as healthcare technologies, [3,4] assistive technologies, [5][6][7] smart homes, [8] and robotics. [9][10][11] With the remarkable progress in related enabling technologies such as 5G/6G communication, [12] the Internet of Things (IoT), [13] the tactile Internet, [14] machine learning, [15] and neural computing, [16] one can only see bright prospects for smart wearable technologies such as intelligent human-machine interfaces (HMIs). [17] The HMI is at the center of efficient collaboration between humans and the rapidly advancing digitalized world. [18,19]

Although human interaction with the real or virtual world takes place through the five basic sensory modalities, the majority of HMI technologies rely on the vision, audio, and touch-sensing modalities, the last of which is the dominant modality for physical interaction. [2,20,21] However, sensory input alone is not enough: effective interaction requires two-way tactile communication between the contact point and the controlling unit (e.g., the brain in the case of humans). [10,22] Achieving two-way tactile communication therefore calls for an interface capable of both sensing a stimulus and delivering feedback. For example, with interactive VR/AR systems, the user can send information via touch sensors and receive signals via different modalities, including visual, auditory, haptic, vestibular, and olfactory stimuli. VR tools that use computer-based interactive simulations to let users engage in environments that look and feel similar to the real world are considered particularly useful for rehabilitation.
[23,24] Likewise, AR systems can infuse interactive virtual elements into the physical environment and thus supplement the real world. [23] The quality of the user experience with these systems depends on how intuitive their interfaces are and on their ability to present easy-to-understand information. However, most VR/AR systems provide primarily visual and auditory feedback, which falls short of a verisimilar, immersive experience and is particularly unsuitable for people with impaired senses, such as deaf and blind users. [25]

Conventionally, HMIs have relied on gadgets such as keyboards, joysticks, mice, screens, and Braille. However, most of these gadgets suffer from one or more drawbacks: they can be bulky and nonintuitive, offer low precision for monitoring human motions or transmitting complex commands, or lack the capability to generate a feedback signal based on tactile sensing. As the demand for richer, more versatile, and seamless interactions rises, researchers have explored several wearable options with sensors based on various sensing mechanisms such as capacitive, [9,[26][27][28][29]