2004 IEEE International Conference on Acoustics, Speech, and Signal Processing
DOI: 10.1109/icassp.2004.1327185
Parametric and non-parametric signal analysis for mapping air flow in the ear-canal to tongue movements: a new strategy for hands-free human-machine interfaces

Cited by 9 publications (12 citation statements). References 3 publications.
“…al. showed that air flow signals caused by the tongue movements in the external auditory canal and collected via a sensitive in-ear microphone device could be mapped to specific tongue movements [6,7]. This earlier study also showed that control commands collected via this in-ear microphone could be used in a man-machine interface for the operation of either a robot or a device for a handicapped person.…”
Section: Introduction
confidence: 90%
“…The system makes use of changes in air pressure or sound waves (vibrations) in the ear to characterize measured parameters. Research has shown that initiating actions, in particular movements of the tongue [3,4] and speech, produce detectable pressure waves with strength corresponding to the direction, speed and/or intensity of the action.…”
Section: Machine Interface System
confidence: 99%
“…In this work, we introduce a method for detecting both tongue movement and speech, and generating a control instruction corresponding to that action that can be applied to any teleoperated or semi-autonomous robot. We have previously reported on the development of a non-intrusive tongue-movement based machine interface without the need for insertion of any device within the oral cavity [3,4]. This interface consists of tracking tongue movement by monitoring changes in airflow that occur in the ear canal.…”
Section: Introduction
confidence: 99%
“…We have explored matched filtering, autoregressive modeling, and non-linear alignment methods to determine the signal classes [5]. Non-linear alignment has shown the greatest promise for recognition accuracy, and is thus enumerated for telerobotic applications.…”
Section: B. Signal Recognition and Classification
confidence: 99%
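The non-linear alignment approach cited above can be illustrated with a minimal dynamic-time-warping (DTW) nearest-template classifier. This is a generic sketch of the technique, not the authors' implementation; the `templates` dictionary and label names are hypothetical, and a real system would operate on feature vectors extracted from the in-ear microphone signal.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic-time-warping distance between two 1-D signals:
    the minimum cumulative |a[i] - b[j]| cost over all monotone alignments."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            # Extend the cheapest of the three predecessor alignments.
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m]

def classify(signal, templates):
    """Nearest-template classification: return the label of the template
    with the smallest warped distance to the observed signal."""
    return min(templates, key=lambda label: dtw_distance(signal, templates[label]))
```

Because DTW warps the time axis, two tongue movements of the same type but slightly different speed still align with low cost, which is one plausible reason non-linear alignment outperformed fixed-length matched filtering in the quoted study.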
“…For example, if a "right" tongue movement was performed, a 97.98% probability of the robot receiving this signal was assumed, with a 2.02% probability of the robot receiving a "top" (forward movement) signal instead. Our work [5] has shown that a 0.2-second interval typifies nearly all recorded tongue movements; this delay was therefore assumed between movement commands. Figure 8 shows the results of a simple simulation where the interface was implemented to direct the robot to reach a series of 20 waypoints in a planar work space.…”
Section: Robotic Performance Simulation
confidence: 99%
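The simulation described in the quoted passage can be sketched as a noisy-command waypoint model. Only the 97.98% delivery probability, the "right"→"top" confusion, and the 0.2 s command interval come from the quote; the grid move set, the full confusion mapping, and the greedy controller are hypothetical fill-ins for illustration.

```python
import random

# From the quoted example: a command is received correctly with probability
# 0.9798; otherwise a confused command arrives. Each command takes ~0.2 s.
P_CORRECT = 0.9798
STEP_TIME = 0.2  # seconds per tongue-movement command

MOVES = {"right": (1, 0), "left": (-1, 0), "top": (0, 1), "bottom": (0, -1)}
# Hypothetical confusion mapping; the quote only specifies "right" -> "top".
CONFUSED_WITH = {"right": "top", "left": "top", "top": "right", "bottom": "right"}

def simulate_to_waypoint(start, goal, rng):
    """Drive a point robot on a grid toward a waypoint, one noisy command
    per step. Returns (steps_taken, elapsed_time_seconds)."""
    x, y = start
    steps = 0
    while (x, y) != goal:
        # Greedy controller: issue the command that reduces the larger error.
        if x != goal[0]:
            intended = "right" if goal[0] > x else "left"
        else:
            intended = "top" if goal[1] > y else "bottom"
        received = intended if rng.random() < P_CORRECT else CONFUSED_WITH[intended]
        dx, dy = MOVES[received]
        x, y = x + dx, y + dy
        steps += 1
    return steps, steps * STEP_TIME
```

Running this over a series of waypoints and averaging the elapsed time gives the kind of throughput estimate the cited Figure 8 reports; misdelivered commands simply add corrective steps at 0.2 s each.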