This paper addresses the design and evaluation of human-like robot movements. Three criteria were proposed and evaluated for their impact on the perceived human-likeness of robot movements: the inertia of the base, the inertia of the end-effector, and the velocity profile. A dedicated tool was designed to generate different levels of anthropomorphism according to these three parameters, and an industrial use case was designed to compare several robot movements. This use case was implemented with a virtual robot arm in a virtual environment, using virtual reality. A user study was conducted to determine which criteria matter most in the perception of human-like robot movements and how they correlate with other notions such as safety and preference. The results showed that the inertia of the end-effector was the most important factor for a movement to be perceived as human-like and non-aggressive, and that these characteristics helped users feel safer, less stressed, and more willing to work with the robot.
We present a fully procedural method capable of generating, in real time, a wide range of locomotion styles for multi-legged characters in a dynamic environment, without using any motion data. The system consists of several independent blocks: a Character Controller, a Gait/Tempo Manager, a 3D Path Constructor, and a Footprints Planner. These four modules work cooperatively to compute, in real time, the footprints and the 3D trajectories of the feet and the pelvis. Our system can animate dozens of creatures using dedicated level-of-detail (LOD) techniques, and is fully controllable, allowing the user to design a multitude of locomotion styles through a user-friendly interface. The result is a complete lower-body animation, which is sufficient for most of the targeted multi-legged characters: arachnids, insects, imaginary n-legged robots, etc.
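The core idea of a footprints planner can be illustrated in a few lines. The sketch below is a hypothetical simplification, not the paper's implementation: it walks along a sampled 2D path and drops alternating left/right footprints every `stride` units, offset laterally by `half_width` from the path direction.

```python
import math

def plan_footprints(path, stride, half_width):
    """Drop alternating left/right footprints every `stride` units
    along a 2-D polyline (a toy Footprints-Planner sketch)."""
    prints = []
    travelled, next_at, side = 0.0, 0.0, 1  # side: +1 left, -1 right
    for (x0, y0), (x1, y1) in zip(path, path[1:]):
        seg = math.hypot(x1 - x0, y1 - y0)
        while next_at <= travelled + seg:
            # Interpolate the footprint position along the segment.
            t = (next_at - travelled) / seg
            px, py = x0 + t * (x1 - x0), y0 + t * (y1 - y0)
            # The segment's perpendicular gives the lateral offset.
            nx, ny = -(y1 - y0) / seg, (x1 - x0) / seg
            prints.append((px + side * half_width * nx,
                           py + side * half_width * ny))
            side = -side
            next_at += stride
        travelled += seg
    return prints

# Straight 4-unit path, 1-unit stride, 0.2-unit lateral offset.
footprints = plan_footprints([(0.0, 0.0), (4.0, 0.0)], 1.0, 0.2)
```

A real planner would additionally account for gait phase, tempo, terrain height, and dynamic obstacles, which is what the cooperating modules above provide.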
We propose a system capable of adding, in real time, controllable and plausible physics-like oscillating reaction effects in response to external forces (perturbations). These oscillating effects may be used to modify a motion or to stylize it in a cartoon-like way. The core of our system is a set of connected 3D pendulums with a propagating reaction. Each pendulum always returns to a preferred direction, which can be fixed in advance or modified during the motion by predefined external data (such as keyframes). The pendulums are fully controllable in terms of reaction time and damping, and the results are completely deterministic. They are easy to implement, even without any prior knowledge of physical simulation. Our system is applicable to articulated bodies driven by predefined motion data (manually authored or captured) or by procedural animation.
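The behavior of one such pendulum can be sketched as a damped spring pulling an angle back toward its preferred direction. This is a minimal 1D illustration under assumed names (`stiffness`, `damping` map loosely onto the paper's reaction-time and damping controls), not the paper's exact formulation:

```python
def step_pendulum(angle, velocity, preferred, stiffness, damping, dt):
    """One semi-implicit Euler step of a damped pendulum that
    returns deterministically toward its preferred direction."""
    accel = stiffness * (preferred - angle) - damping * velocity
    velocity += accel * dt
    angle += velocity * dt
    return angle, velocity

# An external perturbation modeled as an initial angular velocity;
# the pendulum oscillates, then settles back to preferred = 0.
angle, velocity = 0.0, 3.0
for _ in range(2000):
    angle, velocity = step_pendulum(angle, velocity, 0.0, 40.0, 4.0, 0.01)
# angle is now back near 0 (the preferred direction)
```

Chaining several such pendulums and feeding each one's motion into the next as a perturbation gives the propagating reaction described above.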
Many areas of computer science face the need to analyze, quantify, and reproduce movements that express emotion. This paper presents a systematic review of the intelligible factors involved in the expression of emotion in human movement and posture. We have gathered the studies that identify these factors across many disciplinary fields, including psychology, biomechanics, choreography, robotics, and computer vision. These studies each use their own definitions, units, and emotion sets, which has prevented a global, coherent view. We propose a meta-analysis approach that cross-references and aggregates them to produce a unified list of expressive factors quantified for each emotion. A calculation method is then proposed for each expressive factor, and we extract the factors from an emotionally annotated animation dataset, Emilya. The comparison between the results of the meta-analysis and the Emilya analysis reveals high correlation rates, which validates the relevance of the quantified values obtained by both methodologies. The analysis of the results raises interesting perspectives for future research in affective computing.
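Extracting quantified expressive factors from motion data typically reduces to simple kinematic statistics. The sketch below is illustrative only, with assumed factor names and formulas (not the paper's exact definitions): mean speed and mean jerk magnitude are among the kinematic quantities commonly linked to emotional expression, computed here by finite differences on a 1D joint trajectory:

```python
def expressive_factors(positions, dt):
    """Compute two illustrative expressive factors from a sampled
    1-D joint trajectory via finite differences."""
    vel = [(b - a) / dt for a, b in zip(positions, positions[1:])]
    acc = [(b - a) / dt for a, b in zip(vel, vel[1:])]
    jerk = [(b - a) / dt for a, b in zip(acc, acc[1:])]
    return {
        "mean_speed": sum(abs(v) for v in vel) / len(vel),
        "mean_jerk": sum(abs(j) for j in jerk) / len(jerk),
    }

# A perfectly linear trajectory has constant speed and zero jerk.
factors = expressive_factors([0.0, 1.0, 2.0, 3.0], dt=1.0)
```

In practice such factors are computed per joint in 3D and aggregated over annotated clips, which is what makes a dataset like Emilya suitable for validation.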