<p>Eye gaze is potentially fast and ergonomic for target selection in AR, but it is reported to be inaccurate. To compensate for its low accuracy when selecting targets in an AR menu, previous researchers proposed dividing the menu into several sub-menus in which targets are arranged regularly, and mapping the sub-menu pointed at by eye gaze onto the Google Glass touchpad, on which the user confirms the selection of a target within that sub-menu via swipe or tap. However, whether this technique actually improved on basic gaze-touch target selection was not investigated. We coined the term TangibleCounterpart to capture the essence of this technique and suggested using a cellphone touchscreen as the "tangible counterpart" for sub-menus. Further, we proposed a design space for the cellphone-based TangibleCounterpart concept comprising three dimensions: sub-menu size, the way the user holds the cellphone, and the touch technique for confirming selection. Our empirical study showed that error rates were significantly lower than those of basic gaze-touch target selection only when participants confirmed selections with two thumbs (TH) and the sub-menu size was 2 rows × 1 column (2R1C) or 2 rows × 2 columns (2R2C); selection speed was not compromised. The best performance (error rate: mean = 4%, SD = 2.6%; completion time: mean = 1.2 s, SD = 0.2 s) was achieved when participants confirmed selections by tapping under the TH and 2R2C conditions.</p>