We have implemented an augmented reality videoconferencing system that inserts virtual graphics overlays into the live video stream of remote conference participants. The virtual objects are manipulated using a novel interaction technique that cascades bimanual tangible interaction and eye tracking. User studies show that our user interface enriches remote collaboration by offering hitherto unexplored ways of collaborative object manipulation, such as gaze-controlled ray picking of remote physical and virtual objects.

INTRODUCTION

We have created an augmented reality-based videoconferencing tool that allows users to discuss and manipulate real and virtual objects over great distances while preserving non-verbal communication and part of the conference parties' physical environment. We have experimented with user interface techniques that make communication and interaction smoother while discussing real and virtual objects with a remote videoconference party. Physical objects are pose-tracked with handheld fiducial markers, and virtual objects are assigned to tracked physical placeholders. In our application scenario, videoconference parties use both hands to carry out object manipulation and interaction tasks such as translation, rotation, or selection with fiducial markers.

While two-handed tangible interaction may rely solely on complex 2D and 3D gestures aided by traditional input devices such as the mouse and keyboard, we have found that exploiting the human eye as a natural input device during bimanual object manipulation yields faster, richer, and more intuitive communication between partners in remote collaboration tasks. To support this claim, we have implemented and evaluated an interaction technique that cascades bimanual tangible interaction and eye tracking. Figure 1 shows a schema of our AR videoconferencing system enhanced by eye tracking.

This paper first discusses how tangible augmented reality and non-intrusive eye tracking enhance remote collaboration, comparing our system with related work; it then presents our application scenario and interaction technique together with implementation details. We conclude with the results of our user study.
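To make the cascaded technique concrete, the sketch below shows one way gaze-controlled ray picking could work: the eye tracker's 2D gaze point is unprojected into a 3D viewing ray, which is then intersected with bounding spheres of the tracked objects. This is a minimal illustration, not our published implementation; all names (gaze_px, K, objects) are assumptions introduced here.

```python
# Minimal sketch of gaze-controlled ray picking, assuming the eye tracker
# reports a 2D gaze point in video-image coordinates and that object poses
# are already known in camera space. Names are illustrative, not from the
# system described in this paper.
import numpy as np

def gaze_ray(gaze_px, K):
    """Unproject a 2D gaze point (pixels) into a unit viewing ray in
    camera space. K is the 3x3 camera intrinsic matrix."""
    x = (gaze_px[0] - K[0, 2]) / K[0, 0]
    y = (gaze_px[1] - K[1, 2]) / K[1, 1]
    d = np.array([x, y, 1.0])
    return d / np.linalg.norm(d)  # the ray origin is the camera centre

def pick(gaze_px, K, objects):
    """Return the id of the object whose bounding sphere the gaze ray hits
    first, or None. objects: iterable of (obj_id, centre_xyz, radius)."""
    d = gaze_ray(gaze_px, K)
    best_id, best_t = None, np.inf
    for obj_id, centre, radius in objects:
        c = np.asarray(centre, dtype=float)
        t = np.dot(c, d)                 # closest approach along the ray
        if t <= 0.0:
            continue                     # sphere lies behind the camera
        miss_sq = np.dot(c, c) - t * t   # squared ray-to-centre distance
        if miss_sq <= radius * radius and t < best_t:
            best_id, best_t = obj_id, t
    return best_id
```

In the cascade, a pick of this kind would select the target object, after which the tracked pose changes of the handheld fiducial markers drive its translation and rotation; the bounding-sphere test is only one plausible picking primitive, chosen here for brevity.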
REMOTE COLLABORATION IN AR

Users of AR applications can see the real world, which provides a reference frame for their actions. They can see themselves and their collaborators, enabling smooth communication with non-verbal cues during collaborative work. Moreover, a virtual space with synthetic objects is aligned with and superimposed onto the real world and shared among the users. Changes made to manipulated objects during a collaborative session are thus distributed and immediately visible to all participants. Unfortunately, this form of shared AR requires that the collaborators share the same physical space, making it incompatible with remote collaboration over greater distances. State-of-the-art remote collaboration tools include audio/video conferencing and application sharing, which help bridge the distance by displaying the remote parties' real environments. Ap...
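As a minimal illustration of the shared-scene distribution described above, the sketch below broadcasts an object's new pose to every conference party and applies incoming updates to the local scene copy. The paper does not specify a network protocol; JSON over UDP, the PEERS list, and the scene structure are assumptions made purely for illustration.

```python
# Sketch: propagating object manipulations so changes become immediately
# visible to all participants. Protocol and endpoints are hypothetical.
import json
import socket

PEERS = [("198.51.100.2", 9000), ("198.51.100.3", 9000)]  # hypothetical peers

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("", 9000))  # also listen for updates from the other parties

def broadcast_pose(obj_id, position, rotation):
    """Send one object's new pose to every conference party."""
    msg = json.dumps({"id": obj_id, "pos": position, "rot": rotation})
    for peer in PEERS:
        sock.sendto(msg.encode("utf-8"), peer)

def receive_pose(scene):
    """Apply one incoming pose update to the local copy of the shared scene.
    scene is assumed to map object ids to nodes with pos/rot attributes."""
    raw, _addr = sock.recvfrom(4096)
    upd = json.loads(raw.decode("utf-8"))
    node = scene[upd["id"]]
    node.pos, node.rot = upd["pos"], upd["rot"]
```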