Controlling remote robots is a challenging problem in human-computer interface (HCI) design. Remote robot control enables tasks to be accomplished without the human controller being physically present, whether because of safety concerns or because the expert cannot be on site. This paper presents a method for using an Oculus Rift to improve the HCI for telerobotic control. Using the Oculus, an operator becomes immersed in the robot's environment and can more naturally control the desired position of a remotely positioned vision system via head movements. To provide the appropriate visual feedback, a three-axis gimbal was implemented as a test platform. Through software-implemented motion tracking, the response of the Oculus was compared to that of a mouse, demonstrating the efficiency of the proposed system over a comparable HCI.
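The head-movement-to-gimbal mapping described above can be sketched as a simple function that converts a tracked head pose into setpoints for the three gimbal axes. This is a minimal illustrative sketch, not the paper's implementation; the function names, axis order, and mechanical limits are all assumptions.

```python
# Hypothetical sketch: map head-orientation angles (in degrees) reported by
# an HMD to three gimbal setpoints, clamped to assumed mechanical ranges.
# All names and limits here are illustrative, not taken from the paper.

def clamp(value, lo, hi):
    """Limit a value to the closed interval [lo, hi]."""
    return max(lo, min(hi, value))

def head_to_gimbal(yaw, pitch, roll,
                   yaw_range=(-90.0, 90.0),
                   pitch_range=(-45.0, 45.0),
                   roll_range=(-30.0, 30.0)):
    """Return (pan, tilt, roll) gimbal setpoints for a head pose."""
    return (clamp(yaw, *yaw_range),
            clamp(pitch, *pitch_range),
            clamp(roll, *roll_range))

# A head turned 120 degrees right saturates the assumed pan limit.
print(head_to_gimbal(120.0, 10.0, 0.0))
```

Clamping matters because the operator's head can rotate further than the gimbal's mechanical travel, so out-of-range poses must saturate rather than command an unreachable angle.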
Human Computer Interaction (HCI) is central to many applications, including hazardous-environment inspection and telemedicine. Whereas traditional methods of HCI for teleoperating electromechanical systems include joysticks, levers, or buttons, our research focuses on using electromyography (EMG) signals to improve intuitiveness and response time. An important challenge is to accurately and efficiently extract EMG signals and map them to known positions for real-time control. In this preliminary work, we compare the accuracy and real-time performance of several machine-learning techniques for recognizing specific arm positions. We present results from offline analysis, as well as end-to-end operation using a robotic arm.
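The mapping of EMG signals to known arm positions described above can be sketched as a classifier over windowed EMG features. The sketch below uses a minimal nearest-centroid classifier; the feature choices, class labels, and data are synthetic placeholders, not the paper's techniques or dataset.

```python
# Hypothetical sketch: a nearest-centroid classifier assigning windowed
# EMG feature vectors to known arm positions. Features, labels, and
# values are synthetic placeholders, not the paper's data.
import math

def centroid(vectors):
    """Component-wise mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def train(samples):
    """samples: {position_label: [feature_vector, ...]} -> one centroid per label."""
    return {label: centroid(vecs) for label, vecs in samples.items()}

def classify(centroids, features):
    """Return the arm-position label whose centroid is closest (Euclidean)."""
    return min(centroids,
               key=lambda label: math.dist(centroids[label], features))

# Toy training windows: (mean absolute value, zero-crossing count)
samples = {
    "arm_raised":  [[0.80, 12], [0.78, 11], [0.82, 13]],
    "arm_lowered": [[0.20, 30], [0.22, 29], [0.18, 31]],
}
model = train(samples)
print(classify(model, [0.79, 12]))  # a window resembling "arm_raised"
```

A centroid model is far simpler than the machine-learning techniques the abstract compares, but it illustrates the shared pipeline: extract per-window features, fit a model offline, then classify incoming windows fast enough for real-time control.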