Human–machine interfaces (HMIs) have previously relied on a single perception modality, which cannot realize three-dimensional (3D) interaction that is both convenient and accurate across multiple scenarios. Here, we propose a collaborative interface combining electrooculography (EOG) and tactile perception for fast and accurate 3D human–machine interaction. The EOG signals are mainly used for fast, convenient, and contactless 2D (XY-axis) interaction, while the tactile sensing interface is mainly used for complex 2D movement control and for Z-axis control in 3D interaction.
The honeycomb graphene electrodes for EOG signal acquisition and the tactile sensing array are prepared by a laser-induced process. Two pairs of ultrathin and breathable honeycomb graphene electrodes are attached around the eyes to monitor nine different eye movements.
A machine learning algorithm is trained to classify the nine eye movements, achieving an average prediction accuracy of 92.6%.
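The abstract does not state which classifier or features are used; the following is a minimal sketch assuming two-channel EOG windows, simple amplitude and slope features, and a support-vector machine trained with scikit-learn (the data, feature choices, and names are illustrative placeholders only):

import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def eog_features(window: np.ndarray) -> np.ndarray:
    """Simple per-channel features for one EOG window of shape (samples, 2)."""
    return np.concatenate([
        window.mean(axis=0),                           # mean amplitude (horizontal, vertical)
        window.std(axis=0),                            # amplitude variability
        window.max(axis=0) - window.min(axis=0),       # peak-to-peak swing
        np.abs(np.diff(window, axis=0)).mean(axis=0),  # mean slope magnitude
    ])

# windows: (n_trials, samples, 2) EOG segments; labels: nine eye-movement classes.
rng = np.random.default_rng(0)
windows = rng.normal(size=(180, 250, 2))   # placeholder data, not real recordings
labels = rng.integers(0, 9, size=180)      # e.g. up, down, left, right, blink, ...

X = np.array([eog_features(w) for w in windows])
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
scores = cross_val_score(clf, X, labels, cv=5)
print(f"mean cross-validated accuracy: {scores.mean():.3f}")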
Furthermore, an ultrathin (90 μm), stretchable (∼1000%), and flexible tactile sensing interface, assembled from a pair of 4 × 4 planar electrode arrays, is attached to the arm for 2D movement control and Z-axis interaction, realizing single-point, multipoint, and sliding touch functions. Consequently, the tactile sensing interface can achieve eight-direction control and even more complex movement-trajectory control.
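The decoding step itself is not detailed here; the following is a minimal sketch, assuming the centroid of activated taxels is tracked between frames and the resulting slide vector is quantized into eight 45° sectors (the thresholds and function names are illustrative assumptions, not the paper's stated algorithm):

import math
import numpy as np

DIRECTIONS = ["E", "NE", "N", "NW", "W", "SW", "S", "SE"]

def centroid(frame: np.ndarray, threshold: float = 0.5):
    """Centroid (col, row) of taxels whose reading exceeds the touch threshold."""
    rows, cols = np.nonzero(frame > threshold)
    if rows.size == 0:
        return None
    return cols.mean(), rows.mean()

def slide_direction(start_frame: np.ndarray, end_frame: np.ndarray):
    """Quantize the slide vector between two 4 x 4 frames into eight directions."""
    p0, p1 = centroid(start_frame), centroid(end_frame)
    if p0 is None or p1 is None:
        return None
    dx, dy = p1[0] - p0[0], p0[1] - p1[1]   # +y points "up" on the array
    sector = round(math.atan2(dy, dx) / (math.pi / 4)) % 8
    return DIRECTIONS[sector]

# Example: a touch sliding from the lower-left taxel to the upper-right taxel.
start = np.zeros((4, 4)); start[3, 0] = 1.0
end = np.zeros((4, 4));   end[0, 3] = 1.0
print(slide_direction(start, end))  # NE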
Meanwhile, the flexible and ultrathin tactile sensor exhibits an ultrahigh sensitivity of 1.428 kPa⁻¹ in the pressure range of 0–300 Pa, with long-term response stability and repeatability.
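For context, pressure-sensor sensitivity is commonly defined as the slope of the relative change in the output quantity X (e.g., resistance or capacitance; the abstract does not specify which) versus applied pressure. Under that common definition, and assuming approximate linearity over the stated range, the quoted value corresponds to a relative output change of roughly 43% at 300 Pa:

S = \frac{\partial (\Delta X / X_0)}{\partial P} \approx 1.428\ \mathrm{kPa}^{-1}
\;\Rightarrow\;
\left. \frac{\Delta X}{X_0} \right|_{P = 0.3\ \mathrm{kPa}} \approx 1.428 \times 0.3 \approx 0.43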
Therefore, the collaboration between the EOG and tactile perception interfaces will play an important role in rapid and accurate 3D human–machine interaction.