Our goal is to establish a simple baseline method for human identification based on body shape and gait. This baseline recognition method provides a lower bound against which to evaluate more complicated procedures. We present a viewpoint-dependent technique based on template matching of body silhouettes. Cyclic gait analysis is performed to extract key frames from a test sequence. These frames are compared to training frames using normalized correlation, and subject classification is performed by nearest-neighbor matching among correlation scores. The approach implicitly captures biometric shape cues such as body height, width, and body-part proportions, as well as gait cues such as stride length and amount of arm swing. We evaluate the method on four databases with varying viewing angles, background conditions (indoors and outdoors), walking styles, and pixels on target.

Keywords: biometrics (access control), computer vision, correlation methods, gait analysis, image classification, image matching, image motion analysis, image sequences, shape measurement, video databases, arm swing, background conditions, baseline recognition method, biometric shape cues, body height, body shape, body width, body-part proportions, correlation scores, cyclic gait analysis, databases, gait cues, indoor conditions, key frame extraction, lower bound, nearest-neighbor matching, normalized correlation, outdoor conditions, pixels, silhouette-based human identification, stride length, subject classification, template matching, test image sequence, training frames, viewing angles, viewpoint-dependent technique, walking styles
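To make the matching and classification steps concrete, the sketch below gives one plausible reading of the pipeline: silhouette key frames from a probe sequence are scored against a subject's training frames with zero-mean normalized correlation, and the subject is chosen by nearest-neighbor matching on the aggregated scores. This is a minimal illustration, not the paper's implementation; it assumes key frames have already been extracted by cyclic gait analysis and size-normalized, and the function names, the per-frame max over training frames, and the averaging over the probe's gait cycle are assumptions introduced here.

```python
import numpy as np

def normalized_correlation(a, b):
    # Zero-mean normalized correlation between two silhouette images
    # of identical shape (binary or grayscale). Returns a value in [-1, 1].
    a = a.astype(float).ravel()
    b = b.astype(float).ravel()
    a -= a.mean()
    b -= b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b) / denom if denom > 0 else 0.0

def classify(test_keyframes, gallery):
    # Nearest-neighbor classification of a probe sequence.
    #   test_keyframes : list of 2-D arrays, key-frame silhouettes from
    #                    one gait cycle of the probe sequence
    #   gallery        : dict mapping subject id -> list of training key frames
    # Returns the subject id whose training frames correlate best with the
    # probe key frames. (Aggregation by per-frame max and cycle-level mean
    # is an illustrative choice, not taken from the paper.)
    best_id, best_score = None, -np.inf
    for subject_id, train_frames in gallery.items():
        score = np.mean([
            max(normalized_correlation(t, g) for g in train_frames)
            for t in test_keyframes
        ])
        if score > best_score:
            best_id, best_score = subject_id, score
    return best_id
```

As a usage note, one would call classify with key frames cropped to the subject's bounding box and resampled to a common template size, so that the normalized correlation compares corresponding body regions rather than raw image coordinates.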