Advanced human–machine interfaces make robotic devices suitable for studying and enhancing human cognition. This turns robots into powerful neuroscientific tools for investigating processes such as the adaptation between a human operator and the operated robotic device, and how this adaptation modulates human embodiment and embodied cognition. We analyze bidirectional human–machine interface (bHMI) technologies for transparent information transfer between a human and a robot via efferent and afferent channels. Although such interfaces can greatly improve feedback loops and embodiment, advanced bHMIs face immense technological challenges. We critically discuss existing technical approaches, focusing mainly on haptics, and suggest extensions that include other aspects of touch. Moreover, we point out further potential constraints, such as limited functionality, semi-autonomy, intent detection, and feedback methods. From this, we develop a research roadmap to guide the understanding and development of bidirectional human–machine interfaces that enable robotic experiments to empirically study the human mind and embodiment. We conclude that the integration of dexterous control and multisensory feedback is a promising path toward future robotic interfaces, especially for applications in the cognitive sciences.
This article is categorized under:
Computer Science > Robotics
Psychology > Motor Skill and Performance
Neuroscience > Plasticity