The majority of neuroprosthetic interfaces linking an amputee to a prosthetic hand utilise proportional control through electromyography (EMG). The clinical translation of these interfaces can be attributed to their relative simplicity, usually requiring only two EMG electrodes placed over the flexor and extensor muscles of the forearm. This bi-electrode setup enables the opening and closing of the hand grasp, with an additional manual input used to cycle through the various grip patterns. Recent literature has focused mainly on higher degree-of-freedom control, leading to more complicated interfaces; this complexity can be considered the main barrier to their clinical adoption. As a result, new methods for grip-pattern switching have received little serious attention, despite the fieldability of this strategy. In this work, a novel input that augments neuroprosthetic hand control is proposed. The interface is based on bioacoustic signals generated through prescribed tongue movements. We demonstrate that such an interface can provide performance comparable to existing proportional control systems without requiring any additional movements of the upper extremities.