2004 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (IEEE Cat. No.04CH37566)
DOI: 10.1109/iros.2004.1389460

Human-machine interface for tele-robotic operation: mapping of tongue movements based on aural flow monitoring

Abstract: A new human-machine interface is introduced for "hands-free" tele-operation of mobile robots. This interface consists of tracking tongue movement by monitoring changes in airflow that occur in the ear canal. Tongue movements within the human oral cavity create unique, subtle pressure signals in the ear that can be processed to produce command signals in response to that movement. Once recognized, these movements can in turn be used for robotic tele-operation. The complete strategy is tested on 4 to…
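The abstract describes a pipeline in which pressure changes in the ear canal, produced by tongue movement, are processed into command signals for a mobile robot. Below is a minimal, hypothetical sketch of such a mapping stage in Python; the window length, energy feature, thresholds, and command names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Hypothetical command set; the paper maps recognized tongue movements
# to tele-operation commands, but the exact labels are not given here.
COMMANDS = {0: "stop", 1: "forward", 2: "left", 3: "right"}

def frame_signal(x, frame_len=256, hop=128):
    """Split an in-ear pressure/audio signal into overlapping frames."""
    n_frames = 1 + max(0, (len(x) - frame_len) // hop)
    return np.stack([x[i * hop : i * hop + frame_len] for i in range(n_frames)])

def frame_energy(frames):
    """Short-time energy, a simple proxy for aural-flow signal strength."""
    return np.mean(frames ** 2, axis=1)

def classify_movement(features, thresholds=(1e-4, 1e-3, 1e-2)):
    """Toy rule-based mapping from feature magnitude to a command index.
    A real system would use a trained pattern-recognition stage instead."""
    peak = float(np.max(features))
    return int(np.searchsorted(thresholds, peak))

if __name__ == "__main__":
    # Synthetic stand-in for a segment of in-ear microphone data.
    rng = np.random.default_rng(0)
    signal = 0.02 * rng.standard_normal(4096)
    frames = frame_signal(signal)
    cmd = classify_movement(frame_energy(frames))
    print("robot command:", COMMANDS[cmd])
```

The point of the sketch is the data flow (raw in-ear signal, framing, strength feature, command), not the specific classifier, which the paper treats as a pattern-recognition problem.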

Cited by 5 publications (8 citation statements)
References 7 publications
“…al. showed that air flow signals caused by the tongue movements in the external auditory canal and collected via a sensitive in-ear microphone device could be mapped to specific tongue movements [6,7]. This earlier study also showed that control commands collected via this in-ear microphone could be used in a man-machine interface for the operation of either a robot or a device for a handicapped person.…”
Section: Introduction (mentioning)
confidence: 89%
“…The system makes use of changes in air pressure or sound waves (vibrations) in the ear to characterize measured parameters. Research has shown that initiating actions, in particular movements of the tongue [3,4] and speech, produce detectable pressure waves with strength corresponding to the direction, speed and/or intensity of the action.…”
Section: Machine Interface System (mentioning)
confidence: 99%
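The excerpt above states that the strength of the detected pressure wave tracks the direction, speed, and intensity of the action. A hedged sketch of extracting such strength-related features from a windowed in-ear signal might look as follows; the feature names, sampling rate, and rise-time definition are assumptions for illustration only.

```python
import numpy as np

def strength_features(window, fs=8000):
    """Features plausibly related to how forcefully/quickly a tongue
    movement perturbs aural airflow: peak amplitude, RMS energy, and
    a crude rise-time estimate. Parameter choices are illustrative."""
    window = np.asarray(window, dtype=float)
    envelope = np.abs(window)
    peak = float(np.max(envelope))
    rms = float(np.sqrt(np.mean(window ** 2)))
    # Rise time: samples from 10% to 90% of the peak on the rising edge.
    idx_peak = int(np.argmax(envelope))
    rising = envelope[: idx_peak + 1]
    above10 = np.nonzero(rising >= 0.1 * peak)[0]
    above90 = np.nonzero(rising >= 0.9 * peak)[0]
    rise_time = ((above90[0] - above10[0]) / fs) if len(above10) and len(above90) else 0.0
    return {"peak": peak, "rms": rms, "rise_time_s": rise_time}

if __name__ == "__main__":
    t = np.linspace(0, 0.1, 800, endpoint=False)  # 100 ms at 8 kHz
    burst = np.exp(-((t - 0.05) ** 2) / 1e-4) * np.sin(2 * np.pi * 300 * t)
    print(strength_features(burst))
```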
“…In this work, we introduce a method for detecting both tongue movement and speech, and generating a control instruction corresponding to that action that can be applied to any teleoperated or semi-autonomous robot. We have previously reported on the development of a non-intrusive tongue-movement based machine interface without the need for insertion of any device within the oral cavity [3,4]. This interface consists of tracking tongue movement by monitoring changes in airflow that occur in the ear canal.…”
Section: Introduction (mentioning)
confidence: 99%
“…In past work [1,2] we have introduced a non-intrusive tongue-movement based machine interface without the need for insertion of any device within the oral cavity. This interface consists of tracking tongue movement by monitoring changes in airflow that occur in the ear canal.…”
Section: Introduction (mentioning)
confidence: 99%
“…Once recognized, said movements can in turn be used for robotic tele-operation. While previous studies by our research team [1,2] have demonstrated the feasibility of recognizing tongue movements by monitoring flow in the aural cavity, a specific comparison of signal processing and pattern recognition strategies for this unique interface across several test subjects is necessary to begin to properly quantify the phenomenon.…”
Section: Introduction (mentioning)
confidence: 99%
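The last excerpt motivates comparing signal-processing and pattern-recognition strategies across several test subjects. A minimal evaluation harness for such a comparison, sketched here with scikit-learn and synthetic placeholder features (the classifiers, feature dimensions, trial counts, and number of subjects are assumptions, not those used in the paper), could look like this:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def evaluate_per_subject(subject_data, classifiers):
    """Cross-validated accuracy of each classifier for each subject.
    subject_data: dict of subject_id -> (X, y) feature/label arrays."""
    results = {}
    for sid, (X, y) in subject_data.items():
        results[sid] = {
            name: cross_val_score(clf, X, y, cv=5).mean()
            for name, clf in classifiers.items()
        }
    return results

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Synthetic placeholder: 4 subjects, 4 tongue-movement classes,
    # 40 trials per class, 8-dimensional feature vectors.
    subject_data = {}
    for sid in range(4):
        y = np.repeat(np.arange(4), 40)
        X = rng.standard_normal((len(y), 8)) + y[:, None]  # separable toy data
        subject_data[sid] = (X, y)
    classifiers = {"kNN (k=3)": KNeighborsClassifier(n_neighbors=3),
                   "SVM (RBF)": SVC(kernel="rbf")}
    for sid, scores in evaluate_per_subject(subject_data, classifiers).items():
        print(sid, {k: round(v, 3) for k, v in scores.items()})
```

Reporting per-subject cross-validated accuracy, as in this harness, is one straightforward way to quantify how well a given processing strategy generalizes across users of the interface.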