Interaction for Handheld Augmented Reality (HAR) is a challenging research topic because of the small display and limited input options. Although 2D touch-screen input is widely used, 3D gesture interaction has been suggested as an alternative input method. Recent 3D gesture interaction research mainly focuses on using RGB-Depth cameras to detect the spatial position and pose of the fingers, and on using this data for virtual object manipulation in the AR scene. In this paper we review previous 3D gesture research on handheld interaction metaphors for HAR. We present their novelties as well as their limitations, and discuss future research directions for 3D gesture interaction in HAR. Our results indicate that 3D gesture input on HAR is a promising interaction method for assisting users in many tasks, such as education, urban simulation, and 3D games.
Augmented Reality (AR) is a technology that augments the real world by overlaying virtual 3D objects on a view of the real environment, creating an immersive and intuitive experience. As handheld devices are widely used and their specifications continue to improve, touch-based interaction is a natural and appealing input style for AR applications. However, most heuristic studies of interaction in AR focus on AR targets close to the user, generally within arm's reach. As the user moves farther away, the effectiveness and usability of the interaction modalities may change. This study explores handheld AR interaction using real hand gestures at a distance in a room-scale setup. Our aim is to investigate the effectiveness of performing selection while far away from a 3D object, and of performing selection tasks on objects that are occluded. As object manipulation is one of the key features to explore for mobile handheld AR, we hope our study contributes towards its practical use, especially for real-world assembly structures. Hence, this paper proposes a finger ray interaction technique for real hands in a handheld AR interface, designed for selecting objects at distances ranging from 3 feet to 8 feet and for objects that are between 20% and 80% occluded.
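Finger ray selection of the kind described above is commonly implemented by casting a ray from the tracked fingertip along the pointing direction and intersecting it with bounding volumes of the virtual objects. The sketch below is a minimal illustration of that idea, assuming each object is approximated by a bounding sphere; the function names and data layout are illustrative, not taken from the paper.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance along the ray to the sphere, or None if missed.

    `direction` is assumed to be a unit vector (the pointing direction of
    the tracked finger); `origin` is the fingertip position.
    """
    ox, oy, oz = (center[i] - origin[i] for i in range(3))
    # Project the sphere center onto the ray direction.
    t = ox * direction[0] + oy * direction[1] + oz * direction[2]
    if t < 0:
        return None  # sphere lies behind the fingertip
    # Squared distance from the sphere center to the closest point on the ray.
    d2 = (ox * ox + oy * oy + oz * oz) - t * t
    if d2 > radius * radius:
        return None
    return t - math.sqrt(radius * radius - d2)

def pick_object(origin, direction, objects):
    """Select the nearest object hit by the finger ray.

    `objects` is a list of (name, center, radius) tuples. Returning the
    nearest hit lets the ray pass judgement on partially occluded objects:
    whichever intersected object is closest along the ray is selected.
    """
    best, best_t = None, float("inf")
    for name, center, radius in objects:
        t = ray_sphere_hit(origin, direction, center, radius)
        if t is not None and t < best_t:
            best, best_t = name, t
    return best
```

In a real handheld AR pipeline the fingertip position and pointing direction would come from the device's hand-tracking module each frame, and the bounding spheres from the scene graph; disambiguating heavily occluded targets typically needs an extra cue (e.g. depth-based cycling), which this sketch omits.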
Augmented Reality (AR) seamlessly brings a virtual environment into the real-world environment. As an advancing technology, AR promises to change the learning process. The goal of this study is to use freehand gestures to create a virtual block game in AR. The study proceeds in three stages: first, exploring block games and freehand movements using Leap Motion; second, designing and developing the Leap Motion virtual block game; and third, implementing freehand gesture interaction in the game. The paper describes a virtual block AR game using freehand gestures, in which an AR tracking system is merged with a real hand gesture recognition system to execute the freehand interaction. A prototype of the virtual block game is presented, and the paper ends with conclusions and future work.
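A common way to turn hand-tracking data, such as that produced by Leap Motion, into a grab gesture for a block game is pinch detection: when the thumb and index fingertips come within a small distance of each other, the nearest block is grabbed and follows the pinch midpoint. The sketch below illustrates that idea; the threshold value and function names are assumptions for illustration, not details from the paper.

```python
# Illustrative pinch-based grab detection for a freehand block game,
# assuming the hand tracker (e.g. Leap Motion) reports 3D fingertip
# positions in metres each frame.

PINCH_THRESHOLD = 0.03  # metres; assumed value, tuned per device in practice

def distance(a, b):
    """Euclidean distance between two 3D points."""
    return sum((a[i] - b[i]) ** 2 for i in range(3)) ** 0.5

def is_pinching(thumb_tip, index_tip, threshold=PINCH_THRESHOLD):
    """Treat fingertips closer than the threshold as a pinch (grab)."""
    return distance(thumb_tip, index_tip) < threshold

def update_block(block_pos, thumb_tip, index_tip):
    """While pinching, snap the held block to the pinch midpoint;
    otherwise leave it where it is."""
    if is_pinching(thumb_tip, index_tip):
        return tuple((thumb_tip[i] + index_tip[i]) / 2 for i in range(3))
    return block_pos
```

Running this per frame gives a basic grab-and-move interaction; a full implementation would also handle release hysteresis and map the tracker's coordinate frame into the AR scene's frame.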