DOI: 10.22215/etd/2014-10565

A Quaternion-Based Motion Tracking and Gesture Recognition System Using Wireless Inertial Sensors

Abstract: This work examines the development of a unified motion tracking and gesture recognition system that functions through worn inertial sensors. The system comprises ten wireless sensors and uses their quaternion output to map the player's motions to an onscreen character in real time. To demonstrate the capabilities of the system, a simple virtual reality game was created. A hierarchical skeletal model was implemented that allows players to navigate the virtual world without the need of a handhel…
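The abstract describes a hierarchical skeletal model driven by per-sensor quaternion output. A minimal sketch of how such a model might compose each joint's local quaternion with its parent's to obtain world-frame orientations — hypothetical code, not the thesis's implementation; the `(w, x, y, z)` ordering and the `parent`-index representation are assumptions:

```python
# Hypothetical sketch of a hierarchical skeletal model: each joint stores its
# orientation relative to its parent, and world orientations are obtained by
# composing quaternions down the parent-child chain.

def qmul(a, b):
    """Hamilton product of two quaternions in (w, x, y, z) order."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def world_orientations(local_q, parent):
    """local_q[i]: joint i's quaternion relative to its parent.
    parent[i]: index of joint i's parent, or -1 for the root.
    Assumes parents precede children in the list; returns world-frame
    quaternions for every joint."""
    world = [None] * len(local_q)
    for i, q in enumerate(local_q):
        world[i] = q if parent[i] == -1 else qmul(world[parent[i]], q)
    return world
```

With ten worn sensors, each sensor's quaternion would feed the corresponding joint's local orientation, and the composition above yields the pose applied to the onscreen character.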


Cited by 6 publications (18 citation statements)
References 56 publications
“…The previous research on this topic validated the accuracy of this type of gesture classification system using a single sensor as a baseline. It used a Hidden Markov Model and a Modified Markov Chain [1] [7] for classification. Following up on that, we are posing two main research questions.…”
Section: Research Questions
confidence: 99%
“…Each sensor outputs its current orientation in the form of a quaternion, which we will discuss in the next section. Upon startup, the sensor generates its own global reference frame such that the positive z-axis is oriented vertically upwards and the x and y axes are determined by the device's orientation (figure 5) [7]. After that, the MPU on each sensor takes readings from the accelerometer and the gyroscope and calculates an estimate of the current orientation of the sensor relative to those initial axes.…”
Section: Sensor Orientation
confidence: 99%
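The statement above describes each sensor estimating its orientation as a quaternion relative to a reference frame fixed at startup. A minimal sketch of how a consumer of such readings might express a sample relative to the initial orientation and rotate a vector with it — hypothetical code, not the cited system's; the `(w, x, y, z)` convention and the `relative`/`rotate` helpers are assumptions:

```python
# Hypothetical sketch: quaternions as (w, x, y, z) tuples. Re-expresses a
# sensor reading relative to the startup orientation and rotates a vector
# by a unit quaternion via v' = q (0, v) q*.

def qmul(a, b):
    """Hamilton product of two quaternions."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def qconj(q):
    """Conjugate; for unit quaternions this is the inverse rotation."""
    w, x, y, z = q
    return (w, -x, -y, -z)

def relative(q0, qt):
    """Orientation qt expressed relative to the startup orientation q0."""
    return qmul(qconj(q0), qt)

def rotate(q, v):
    """Rotate 3-vector v by unit quaternion q."""
    w, x, y, z = qmul(qmul(q, (0.0,) + tuple(v)), qconj(q))
    return (x, y, z)
```

For example, a 90° rotation about the vertical z-axis, (√2/2, 0, 0, √2/2), maps the sensor's x-axis onto the global y-axis, consistent with the z-up reference frame described in the quote.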