Computer graphics and animation, driven by advances in hardware and software, have become broad and demanding areas of research. Animation of human motion is a major component of the field and has attracted many researchers due to its significance in movies, games, and virtual environments. We propose that features for style and affect, which are fundamental determinants of personality and natural-looking motion, should be processed through perceptually guided techniques. In this dissertation, we adopt this approach and develop a set of tools for the extraction, synthesis, and analysis of affective and stylistic motion features.

Temporal alignment is one of the most common issues in processing motion data. Accordingly, we first propose a new time warping technique for motion. The proposed method outperforms several existing techniques and offers precise alignment, low distortion, smooth warped motion trajectories, and high customizability.

Many motion processing techniques operate incrementally (joint to joint) on motion sequences, and some systems process only selected joints or regions of the body. It is therefore imperative to verify whether such partial or subset computations can lead to perceptually accurate results. Accordingly, we investigate and validate the notion of additivity in the perception of affect from motion.

A system capable of extracting style/affect features from motion data using spline optimization is then introduced. Our method has several advantages over existing techniques: it extracts the features as three separate movement, posture, and time components, which are the perceptual and functional sources of stylistic/affective motion, and it operates in Cartesian or joint-angle spaces rather than the Eigen/latent sub-spaces that are the common trend in existing techniques.

Toward the synthesis of style/affect features, a perception-based, expert-driven approach is used. Gaussian radial basis functions (RBFs) are first introduced as mathematical constructs for stylistic/affective features. A user interface is then developed with which animators can apply these basis functions to synthesize the desired stylistic/affective features. Through analysis and an in-depth study of data collected from several animators, expert-driven perceptual shortcuts for generating different stylistic/affective themes are derived. The features also shed light on various aspects of the execution and perception of style/affect.

A unified system capable of both classification and translation of stylistic/affective features in motion is subsequently developed using ensembles of Gaussian RBF neural networks. The recognition module of the system outperforms several other classifiers, and the style translation module produces results that viewers validate as perceptually accurate. Finally, to provide a set of guidelines for animators, based on which stylistic/affective features can be modified or added to motion data, an empirical paradigm i...
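As background for the temporal-alignment contribution above, the following is a minimal sketch of classical dynamic time warping (DTW), the standard point of reference for aligning motion trajectories. It is not the dissertation's proposed warping technique, and the function and variable names are illustrative assumptions; the sketch operates on one-dimensional trajectories for clarity.

    import numpy as np

    def dtw_align(a, b):
        """Classical dynamic time warping between two 1-D trajectories.

        Returns the alignment cost and the warping path as (i, j) index
        pairs. This is the textbook baseline, not the dissertation's
        proposed warping technique.
        """
        n, m = len(a), len(b)
        # cost[i, j] = cheapest cumulative cost aligning a[:i] with b[:j]
        cost = np.full((n + 1, m + 1), np.inf)
        cost[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                d = abs(a[i - 1] - b[j - 1])
                cost[i, j] = d + min(cost[i - 1, j - 1],  # match
                                     cost[i - 1, j],      # insertion
                                     cost[i, j - 1])      # deletion
        # Backtrack from (n, m) to recover the warping path.
        path, i, j = [], n, m
        while i > 0 and j > 0:
            path.append((i - 1, j - 1))
            step = np.argmin([cost[i - 1, j - 1], cost[i - 1, j], cost[i, j - 1]])
            if step == 0:
                i, j = i - 1, j - 1
            elif step == 1:
                i -= 1
            else:
                j -= 1
        return cost[n, m], path[::-1]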
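The spline-based extraction idea can be illustrated in a generic form: fit a smooth spline to a joint trajectory and treat the remainder as a candidate stylistic component. The sketch below uses SciPy's smoothing spline and is a deliberately simplified assumption; the dissertation's actual spline optimization and its movement/posture/time decomposition are not reproduced here.

    import numpy as np
    from scipy.interpolate import UnivariateSpline

    def split_spline_residual(t, traj, smoothing=1.0):
        """Split a joint trajectory into a smooth spline baseline and a
        residual component.

        A generic illustration of isolating a motion component with
        splines, not the dissertation's extraction method.
        """
        spline = UnivariateSpline(t, traj, s=smoothing)  # smoothing cubic spline
        baseline = spline(t)
        residual = traj - baseline
        return baseline, residual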
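In their generic form, the Gaussian RBF constructs introduced for stylistic/affective features can be written as below; the per-feature amplitude a_i, center c_i, and width sigma_i shown here are one plausible parameterization, not necessarily the dissertation's exact formulation.

    \phi_i(t) = a_i \exp\!\left( -\frac{(t - c_i)^2}{2\sigma_i^2} \right),
    \qquad
    \tilde{\theta}(t) = \theta(t) + \sum_{i=1}^{k} \phi_i(t)

where theta(t) is a neutral joint trajectory and tilde-theta(t) its stylized counterpart obtained by superimposing k Gaussian basis functions.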
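The classification half of the unified system rests on ensembles of Gaussian RBF neural networks. The sketch below shows one generic way such a network and a majority-vote ensemble can be built; the center-selection and width heuristics, as well as all names, are assumptions rather than the dissertation's design.

    import numpy as np

    class RBFNet:
        """Minimal Gaussian RBF network trained by least squares.

        Centers are a random subset of the training data and a shared
        width is derived from the spread of the centers; both are
        generic textbook choices.
        """
        def __init__(self, n_centers=10, seed=0):
            self.n_centers = n_centers
            self.rng = np.random.default_rng(seed)

        def _phi(self, X):
            # Gaussian activations: exp(-||x - c||^2 / (2 sigma^2))
            d2 = ((X[:, None, :] - self.centers[None, :, :]) ** 2).sum(-1)
            return np.exp(-d2 / (2.0 * self.sigma ** 2))

        def fit(self, X, y):
            idx = self.rng.choice(len(X), self.n_centers, replace=False)
            self.centers = X[idx]
            # Shared width from the maximum pairwise center distance.
            dists = np.linalg.norm(self.centers[:, None] - self.centers[None, :], axis=-1)
            self.sigma = dists.max() / np.sqrt(2.0 * self.n_centers) + 1e-8
            # One-hot targets; linear output weights via least squares.
            self.classes = np.unique(y)
            T = (y[:, None] == self.classes[None, :]).astype(float)
            self.W, *_ = np.linalg.lstsq(self._phi(X), T, rcond=None)
            return self

        def predict(self, X):
            return self.classes[np.argmax(self._phi(X) @ self.W, axis=1)]

    def ensemble_predict(models, X):
        """Majority vote across an ensemble; assumes integer class labels."""
        votes = np.stack([m.predict(X) for m in models])  # (n_models, n_samples)
        return np.array([np.bincount(v).argmax() for v in votes.T])

An ensemble would then be assembled as, for example, models = [RBFNet(n_centers=20, seed=s).fit(X, y) for s in range(5)] before calling ensemble_predict(models, X_test).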