Existing interfaces such as the digital pen capture only one kind of gesture; the proposed system introduces a new interaction process that can recognize both small-scale and large-scale gestures. An algorithmic framework combines acceleration and surface electromyographic (SEMG) signals for gesture identification. In addition, the system consists of a wearable gesture-sensing device and a mobile-phone application built around this framework, designed to realize gesture-based real-time interaction. The device is worn on the forearm, allowing the user to manipulate a mobile phone with predefined gestures or even personalized ones. In this system, the accelerometer measures large-scale gestures and the SEMG sensors measure small-scale gestures; both kinds of sensors are used together for gesture recognition.
This paper presents a head- and hand-gesture recognition system for Human Computer Interaction (HCI). Head and hand gestures are an important modality for HCI, and a vision-based recognition system can give computers the capability of understanding and responding to them. The aim of this paper is to propose a real-time vision system for application within a multimedia interaction environment. The recognition system consists of four modules: image capture, image extraction, pattern matching, and command determination. When hand and head gestures are shown in front of the camera, the hardware performs the corresponding action: gestures are matched against a stored database of gestures using pattern matching, and the hardware is moved in the left, right, forward, or backward direction accordingly. An algorithm for optimizing connected-component analysis in gesture recognition is proposed, which makes use of segmentation in two images. The connected-component algorithm scans an image and groups its pixels into components based on pixel connectivity, i.e. all pixels in a connected component share similar intensity values and are in some way connected with each other. Once all groups have been determined, each pixel is labeled with a color according to the component it was assigned to.
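The connected-component grouping described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes the input has already been segmented into a binary image (1 = foreground) and uses 4-connectivity with a breadth-first flood fill to assign each foreground pixel an integer label.

```python
from collections import deque

def connected_components(image):
    """Label connected components in a binary image (4-connectivity).

    Foreground pixels (value 1) that touch horizontally or vertically
    are grouped into the same component; each component receives a
    distinct integer label starting from 1.
    Returns (label matrix, number of components).
    """
    rows, cols = len(image), len(image[0])
    labels = [[0] * cols for _ in range(rows)]
    next_label = 0
    for r in range(rows):
        for c in range(cols):
            if image[r][c] == 1 and labels[r][c] == 0:
                # Unlabeled foreground pixel: start a new component
                next_label += 1
                labels[r][c] = next_label
                queue = deque([(r, c)])
                while queue:
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and image[ny][nx] == 1
                                and labels[ny][nx] == 0):
                            labels[ny][nx] = next_label
                            queue.append((ny, nx))
    return labels, next_label
```

Once the label matrix is computed, each label can be mapped to a display color, which is what produces the colored-component visualization mentioned above.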
A natural and friendly interface is critical for the development of service robots. A gesture-based interface offers a way for untrained users to interact with robots more easily and efficiently. Robots are, or soon will be, used in such critical domains as search and rescue, military battle, mine and bomb detection, scientific exploration, law enforcement, and hospital care. Such robots must coordinate their behaviors with the requirements and expectations of human team members; they are more than mere tools, rather quasi-team members whose tasks have to be integrated with those of humans. In this sense, the goal of this paper is to present a posture recognition technique applied in ubiquitous computing, using an efficient bio-inspired posture recognition algorithm for the proposed scheme. The scheme reduces the size of the database used to store the different human postures that the robot interprets as commands: the picture frame is divided into scan lines, and the pixel color values under these scan lines are examined to guess the user's posture. The robot then uses the recognized posture as a command and acts accordingly.