When animating virtual humans for real-time applications such as games and virtual reality, animation systems often have to edit motions in order to be responsive. In many cases, contacts between the feet and the ground are not (or cannot be) properly enforced, resulting in a disturbing artifact known as footsliding or footskate. In this paper, we explore the perceptibility of this error and show that participants can perceive even very low levels of footsliding (<21 mm in most conditions). We then explore the visual fidelity of animations where footskate has been cleaned up using two different methods. We found that corrected animations were always preferred to those with footsliding, irrespective of the extent of the correction required. We also determined that a simple approach of lengthening limbs was preferred to a more complex approach using IK fixes and trajectory smoothing.
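The footsliding error studied above can be quantified as the horizontal drift of a foot joint while it is supposed to be planted. A minimal sketch of such a metric, assuming per-frame world-space foot positions and a precomputed contact mask (the function name and array layout are our own illustration, not from the paper):

```python
import numpy as np

def footskate_magnitude(foot_positions, contact_mask):
    """Estimate foot sliding per contact phase.

    foot_positions: (T, 3) world-space foot joint positions per frame.
    contact_mask:   (T,) bool, True while the foot should be planted.
    Returns the maximum horizontal drift (in metres) within any contact phase.
    """
    drifts = []
    start = None
    for t in range(len(contact_mask) + 1):
        planted = t < len(contact_mask) and contact_mask[t]
        if planted and start is None:
            start = t                      # contact phase begins
        elif not planted and start is not None:
            # Horizontal (x, z) positions during the contact phase;
            # drift is measured relative to the position at first contact.
            phase = foot_positions[start:t, [0, 2]]
            drifts.append(np.linalg.norm(phase - phase[0], axis=1).max())
            start = None
    return max(drifts) if drifts else 0.0
```

Under this metric, the paper's perceptibility result would correspond to flagging any clip where the returned value exceeds roughly 0.021 m.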
Figure 1: The experiment stimuli and their creation: the stimuli as presented to the subject (top) and their creation, with trajectories and characters coloured according to their respective animations (bottom). In the depicted trial, the gold standard is shown on the left.

Figure 2: Trajectory optimisation: random trajectories result in a large number of intersections (1), causing many collisions (2). Our optimisation approach ensures a collision-free scenario (3, 4).

Abstract: In order to simulate plausible groups or crowds of virtual characters, it is important to ensure that the individuals in a crowd do not look, move, behave or sound identical to each other. Such obvious 'cloning' can be disconcerting and reduce the engagement of the viewer with an animated movie, virtual environment or game. In this paper, we focus in particular on the problem of motion cloning, i.e., where the motion from one person is used to animate more than one virtual character model. Using our database of motions captured from 83 actors (45M and 38F), we present an experimental framework for evaluating human motion, which allows both the static (e.g., skeletal structure) and dynamic (e.g., walking style) aspects of an animation to be controlled. This framework enables the creation of crowd scenarios using captured human motions, thereby generating simulations similar to those found in commercial games and movies, while allowing full control over the parameters that affect the perceived variety of the individual motions in a crowd. We use the framework to perform an experiment on the perception of characteristic walking motions in a crowd, and conclude that the minimum number of individual motions needed for a crowd to look varied could be as low as three. While the focus of this paper was on the dynamic aspects of animation, our framework is general enough to be used to explore a much wider range of factors that affect the perception of characteristic human motion.
Graphics, Vision and Visualisation Group, Trinity College Dublin

Figure 1: (a) screenshot from the running experiment (side view, stick figure); (b) stick figure and geometrical model used in the study; (c) experiment results as absolute differences in locomotion speed, showing that speeding up the motion produces severe perceptual artifacts while even a significant slowdown is perceptually acceptable.

Abstract: Understanding the perception of humanoid character motion can provide insights that will enable realism, accuracy, computational cost and data storage space to be optimally balanced. In this sketch we describe a preliminary perceptual evaluation of human motion timewarping, a common editing method for motion capture data. During the experiment, participants were shown pairs of walking motion clips, both timewarped and at their original speed, and asked to identify the real animation. We found a statistically significant difference between speeding up and slowing down, which shows that displaying clips at higher speeds produces obvious artifacts, whereas even significant reductions in speed were perceptually acceptable.

Computer Animation and Perception
Computer animation and motion perception are closely related fields, as the result of motion synthesis is always presented to a live observer. In this study, we focus on motion timewarping, a method used in both parametric models and state machines for matching and transitioning between clips of the same type with different timing or speed. Timewarping, and particularly dynamic timewarping, originates in speech recognition and was successfully used, alongside other signal processing techniques, on animation data (e.g., Bruderlin and Williams [1995]). However, the perceptual implications of such manipulations have not been extensively studied and are hard to predict.
Our evaluation approach is closely related to the work of Reitsma and Pollard [2003], but we focus on human locomotion rather than generic ballistic motion.

Experiment Design
Five motion-captured clips of a walking animation served as the stimuli for our experiment. These five animation speeds covered a normal range of human walking, from 0.8 m/s to 2.4 m/s in 0.4 m/s increments. For each clip, we created four other versions using timewarping to match the speeds of the other clips, giving a total of 25 clips (e.g., the 1.2 m/s motion was slowed down to 0.8 m/s and sped up to 1.6, 2.0 and 2.4 m/s). We hypothesised that timewarping would be less noticeable when the timewarped speed is close to the clip's original speed. The experiment consisted of sequences depicting two animated characters side by side (Figure 1). Both characters (Figure 1b) were either stick figures or geometric models (a model's realism has been found to affect perceptual sensitivity to errors in motion [Hodgins et al. 1998]), facing forward or sideways (to test if motion error sensitivity is affected b...
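The stimulus generation described above amounts to uniformly timewarping each clip to the other four speeds. A minimal sketch of such a uniform timewarp via linear resampling of pose frames (the function name and array layout are our own illustration; the paper does not specify its implementation):

```python
import numpy as np

def timewarp(frames, speed_ratio):
    """Uniformly timewarp a motion clip by linear resampling.

    frames:      (T, D) array of poses sampled at a fixed frame rate.
    speed_ratio: target_speed / original_speed; values > 1 play the
                 clip faster, producing fewer output frames.
    """
    T = len(frames)
    n_out = max(2, int(round(T / speed_ratio)))
    # Evenly spaced sampling instants on the original timeline.
    src = np.linspace(0, T - 1, n_out)
    i0 = np.floor(src).astype(int)
    i1 = np.minimum(i0 + 1, T - 1)
    w = (src - i0)[:, None]
    # Linear interpolation between the two nearest source frames.
    return (1 - w) * frames[i0] + w * frames[i1]
```

For example, warping the 1.2 m/s clip to 2.4 m/s corresponds to `speed_ratio = 2.0`, which keeps roughly half the frames; slowing it to 0.8 m/s uses `speed_ratio ≈ 0.67` and interpolates extra frames.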