This paper describes a video analysis system, free of markers and setup procedures, that quantitatively identified gait abnormalities in real time from standard video images. A novel color three-dimensional body model was sized and texture-mapped to the specific characteristics of a person from video images. The kinematics of the body model were represented by a transformation tree used to track the position and orientation of a person relative to the camera. Joint angles tracked the location and orientation of each body part, and the range of each joint angle was constrained by the degrees of freedom associated with that joint. To stabilize tracking, the joint angles for the next frame were predicted; this prediction was cast as an estimation problem and solved with an iterated extended Kalman filter. Patients with dopa-responsive Parkinsonism and age-matched normal subjects were videotaped during several gait cycles, and their walking movements were successfully tracked and classified. The results suggested that this approach has the potential to guide clinicians on the relative sensitivity of specific postural and gait features in diagnosis.
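As a minimal sketch of the estimation step, the equations below give the standard iterated extended Kalman filter prediction and measurement update; the specific symbols are illustrative assumptions rather than the paper's exact formulation. Here $\mathbf{x}_k$ collects the joint angles at frame $k$, $\mathbf{z}_k$ denotes the image measurements, $h(\cdot)$ is a measurement model relating joint angles to observations, and $\mathbf{Q}_k$, $\mathbf{R}_k$ are process and measurement noise covariances.

\[
\begin{aligned}
\hat{\mathbf{x}}_{k}^{-} &= f(\hat{\mathbf{x}}_{k-1}),
\qquad
\mathbf{P}_{k}^{-} = \mathbf{F}_{k}\,\mathbf{P}_{k-1}\,\mathbf{F}_{k}^{\top} + \mathbf{Q}_{k},\\[4pt]
\mathbf{H}_{i} &= \left.\frac{\partial h}{\partial \mathbf{x}}\right|_{\mathbf{x}_{i}},
\qquad
\mathbf{K}_{i} = \mathbf{P}_{k}^{-}\mathbf{H}_{i}^{\top}\!\left(\mathbf{H}_{i}\mathbf{P}_{k}^{-}\mathbf{H}_{i}^{\top} + \mathbf{R}_{k}\right)^{-1},\\[4pt]
\mathbf{x}_{i+1} &= \hat{\mathbf{x}}_{k}^{-} + \mathbf{K}_{i}\!\left[\mathbf{z}_{k} - h(\mathbf{x}_{i}) - \mathbf{H}_{i}\!\left(\hat{\mathbf{x}}_{k}^{-} - \mathbf{x}_{i}\right)\right],
\end{aligned}
\]

starting from $\mathbf{x}_{0} = \hat{\mathbf{x}}_{k}^{-}$ and iterating the last two lines until convergence, after which the state estimate is set to the final iterate and the covariance is updated as $\mathbf{P}_{k} = (\mathbf{I} - \mathbf{K}_{i}\mathbf{H}_{i})\,\mathbf{P}_{k}^{-}$. Joint-angle limits implied by the degrees of freedom at each joint would be enforced by constraining the iterates to their admissible ranges.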