This work examines the development of a unified motion tracking and gesture recognition system based on worn inertial sensors. The system comprises ten wireless sensors and uses their quaternion output to map the player's motions to an on-screen character in real time. To demonstrate the system's capabilities, a simple virtual reality game was created. A hierarchical skeletal model was implemented that allows players to navigate the virtual world without the need for a handheld controller. In addition to motion tracking, the system was evaluated for its potential for gesture recognition: a sensor on the right forearm was used to test six different gestures, each with 500 training samples. Despite the widespread use of Hidden Markov Models for gesture recognition, our modified Markov Chain algorithm achieved a higher average accuracy of 95% as well as faster computation times, making it an ideal candidate for real-time applications. Combining motion tracking and dynamic gesture recognition into a single unified system is unique in the literature and comes at a time when virtual reality and wearable computing are emerging in the marketplace.