We present M[eye]cro, an interaction technique for selecting on-screen objects and navigating menus through the synergistic use of eye-gaze and thumb-to-finger microgestures. Thumb-to-finger microgestures are gestures performed with the thumb of a hand on the fingers of the same hand. The active body of research on microgestures highlights expected properties including speed, availability, and eyes-free interaction. Such properties make microgestures strong candidates for multitasking. However, while often praised, the hypothesis that microgestures benefit multitasking has never been quantitatively verified. We study M[eye]cro and compare it to a baseline technique based on physical controllers in a cockpit-based context. This context allows us to design a controlled experiment involving multitasking, with low- and high-priority tasks performed in parallel. Our results show that the two techniques perform similarly when participants only carry out the selection task. However, M[eye]cro tends to yield better time performance when participants must additionally handle high-priority tasks in parallel. Results also show that M[eye]cro induces less fatigue and is preferred by most participants.