2018 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC)
DOI: 10.1109/embc.2018.8512668

The Helping Hand: An Assistive Manipulation Framework Using Augmented Reality and Tongue-Drive Interfaces

Abstract: A human-in-the-loop system is proposed to enable collaborative manipulation tasks for persons with physical disabilities. Studies show that the subject's cognitive burden decreases as the autonomy of the assistive system increases. Our framework obtains high-level intent from the user to specify manipulation tasks. The system processes sensor input to interpret the user's environment. Augmented reality glasses provide ego-centric visual feedback of the interpretation and summarize robot affordances on a menu. A tongue…
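The abstract describes a perceive–summarize–select loop: the system interprets the scene, lists robot affordances on an AR menu, and the user picks one with discrete tongue-drive commands. A minimal sketch of that loop, with all names and the command vocabulary ("next"/"ok") being illustrative assumptions rather than the paper's actual interface:

```python
from dataclasses import dataclass

@dataclass
class Affordance:
    # One action the robot can offer for a detected object, e.g. grasp a cup.
    object_label: str
    action: str

def build_menu(detected_labels):
    """Summarize robot affordances on a menu: one entry per detected object."""
    return [Affordance(label, "grasp") for label in detected_labels]

def select(menu, command_stream):
    """Map discrete tongue-drive commands to a menu choice.

    'next' cycles the highlighted entry; 'ok' confirms it.
    """
    idx = 0
    for cmd in command_stream:
        if cmd == "next":
            idx = (idx + 1) % len(menu)
        elif cmd == "ok":
            return menu[idx]
    return None  # stream ended without a confirmation

menu = build_menu(["cup", "book"])
choice = select(menu, ["next", "ok"])
```

Here `choice` is the second menu entry, since one "next" advances the highlight before "ok" confirms.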

Cited by 23 publications (15 citation statements)
References 24 publications
“…The authors in (Chu et al., 2018) used a Tongue Drive System as input for an assistive grasping system facilitated through an AR interface.…”
Section: Discussion
confidence: 99%
“…The authors in (Chu et al., 2018) used a Tongue Drive System as input for an assistive grasping system facilitated through an AR interface. The system implements the YOLO neural network [39] for object detection and a deep grasp algorithm for detecting the graspable locations for each object.…”
Section: Robotic Arms
confidence: 99%
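The snippet above describes a two-stage pipeline: an object detector (YOLO) produces bounding boxes, and a grasp detector proposes graspable locations per object. A minimal sketch of that data flow, with the detector and grasp stages mocked (a real system would run the YOLO network and a learned grasp model here; the box-centre heuristic is purely illustrative):

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Detection:
    label: str
    box: Tuple[int, int, int, int]  # (x1, y1, x2, y2) in pixels

def detect_objects(image) -> List[Detection]:
    """Stand-in for a YOLO forward pass: returns labeled bounding boxes."""
    # A real implementation would run the network on `image`;
    # fixed detections are returned here so the sketch is self-contained.
    return [Detection("cup", (10, 20, 30, 40)),
            Detection("book", (50, 60, 90, 80))]

def grasp_candidates(det: Detection) -> List[Tuple[int, int]]:
    """Stand-in for a deep grasp network: propose the box centre as a grasp point."""
    x1, y1, x2, y2 = det.box
    return [((x1 + x2) // 2, (y1 + y2) // 2)]

# Detection feeds grasp proposal: one candidate list per detected object.
grasps = {d.label: grasp_candidates(d) for d in detect_objects(None)}
```

Each detected object thus maps to a short list of grasp candidates that an AR menu could present to the user.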
“…Especially when working with static robots, marker-based approaches are widely used due to their simple setup and competitive accuracy. Several works, including Aoki et al. [14] and [15], relied on a marker-based calibration between the robot and the AR device. However, Baratoff et al. [16] stated that instrumenting the real environment by placing additional tools such as markers, sensors, or cameras is a main bottleneck for complex use cases, especially in navigation, which involve dynamically changing environments.…”
Section: AR Calibration Methods
confidence: 99%
“…Furthermore, a combination of sensing modalities, learning methods, and control schemes is used by many researchers to create more efficient HRI systems. Chu et al. created a framework that uses a tongue-driven interface to obtain the user's intent and verify it visually through an augmented reality headset [46]. Sasaki et al. developed a novel tongue interface based on the classification of tongue motions from the surface EMG signals of the suprahyoid muscles detected at the underside of the jaw [47].…”
Section: Assistive Robotic Manipulators
confidence: 99%