Abstract—We introduce an unobtrusive sensor-based human-machine interface for controlling robotic and rehabilitative devices. The interface can direct robotic or assist devices in response to tongue movement and/or speech without inserting any device into or near the oral cavity. It is centered on the unique properties of the human ear as an acoustic output device. Our work has shown that various movements within the oral cavity create unique, traceable pressure changes in the human ear, which can be measured with a simple sensor (such as a microphone) and analysed to produce command signals, which can in turn be used to control robotic devices. In this work, we present: 1) an analysis of the sensitivity of human ear canals as acoustic output devices, 2) the design of a new sensor for monitoring airflow in the aural canal, 3) pattern recognition procedures for recognizing both speech and tongue movement by monitoring aural flow across several human test subjects, and 4) a conceptual design and simulation of the machine interface system.