We describe the research and integration methods we developed to give the HRP-2 humanoid robot the capability to climb vertical industrial-norm ladders. Our approach relies on our multi-contact planner and a multi-objective closed-loop controller formulated as a QP (quadratic program). First, a set of contacts to climb the ladder is planned off-line (automatically or by the user). These contacts are provided as input to a finite state machine, which builds additional intermediary tasks accounting for geometric uncertainties and specific grasp procedures to be realized by our multi-objective model-based QP controller. This controller provides instantaneous desired states, in terms of joint accelerations and contact forces, to be tracked by the embedded low-level motor controllers. Our trials revealed that hardware changes are needed on the HRP-2 and that parts of the software must be made more robust. Yet, we confirmed that HRP-2 has the kinematic and power capabilities to climb real industrial ladders, such as those found in nuclear power plants and in large-scale manufacturing sites such as shipyards, aircraft factories, and construction sites.

Keywords: Humanoid robots · multi-contact motion planning and control · field humanoid robots · disaster humanoid robots

[Figure caption] Note that: (i) it is not possible to place both feet on the same rung; (ii) the closed grippers do not grasp the rungs firmly; (iii) each foot can be freely positioned on each rung: the right foot is rotated to increase the reach of the left arm toward the higher rung.
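As a rough illustration of the kind of task-space QP this abstract refers to (not the authors' actual formulation, which also handles contact forces, friction cones, and joint limits as constraints), the following minimal sketch minimizes the tracking error of a desired task acceleration through a task Jacobian. The names J, ddx_des, and the dimensions are illustrative assumptions.

```python
import numpy as np

def solve_task_qp(J, ddx_des, eps=1e-6):
    """Unconstrained least-squares core of a task-space QP:
    minimize ||J @ ddq - ddx_des||^2 + eps * ||ddq||^2 over ddq."""
    n = J.shape[1]
    H = J.T @ J + eps * np.eye(n)   # regularized quadratic cost (Hessian)
    g = J.T @ ddx_des               # linear term from the tracking objective
    return np.linalg.solve(H, g)    # joint accelerations minimizing the cost

# Toy usage: a 30-DoF humanoid with a 6-D task (e.g. a gripper pose on a rung).
rng = np.random.default_rng(0)
J = rng.standard_normal((6, 30))                     # hypothetical task Jacobian
ddx_des = np.array([0.0, 0.0, 0.1, 0.0, 0.0, 0.0])   # small desired lift along z
ddq = solve_task_qp(J, ddx_des)
print(ddq.shape)  # (30,)
```

A full whole-body controller would stack several such tasks with weights and add equality/inequality constraints (dynamics, contacts), which is where a dedicated QP solver comes in.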
We propose a framework for combining visual and haptic information in human-robot joint actions. It consists of a hybrid controller that uses both visual servoing and impedance controllers, and it can be applied to tasks that cannot be accomplished with vision or haptics alone. In this framework, the state of the task is obtained from visual information, while haptic information is crucial for safe physical interaction with the human partner. The approach is validated on the task of jointly carrying a flat surface (e.g., a table) while preventing an object (e.g., a ball) on top from falling off. The results show that this task can be successfully achieved. Furthermore, the framework allows for a more collaborative setup by imparting task knowledge to the robot, rather than treating it as a passive follower.
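For intuition, here is a minimal sketch of the impedance half of such a hybrid scheme. The gains and geometry are made-up values, and in the actual framework the visual-servoing half would supply the reference trajectory x_ref, dx_ref.

```python
import numpy as np

K = np.diag([200.0, 200.0, 400.0])  # Cartesian stiffness (N/m), illustrative
D = np.diag([30.0, 30.0, 60.0])     # Cartesian damping (N*s/m), illustrative

def impedance_force(x, dx, x_ref, dx_ref):
    """Commanded Cartesian force: F = K (x_ref - x) + D (dx_ref - dx)."""
    return K @ (x_ref - x) + D @ (dx_ref - dx)

# Example: end effector 2 cm below the reference, at rest.
F = impedance_force(np.array([0.0, 0.0, 0.98]), np.zeros(3),
                    np.array([0.0, 0.0, 1.00]), np.zeros(3))
print(F)  # [0. 0. 8.] -> gentle upward corrective force
```

The compliant (spring-damper) behavior is what keeps the physical interaction with the human partner safe: errors in the visually derived reference translate into bounded forces rather than rigid position tracking.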
In this paper, we propose a control scheme that allows a humanoid robot to perform a complex transportation scenario jointly with a human partner. At first, the robot infers the human partner's intentions in order to participate proactively in the task. In a second phase, the human-robot dyad switches roles: the robot takes over leadership of the task to complete the scenario, during which it is remotely controlled with a joystick. The scenario is realized on a real HRP-2 humanoid robot to assess the overall approach.
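The two-phase role switch could be organized as a small state machine; the sketch below is a hypothetical illustration (state names and the switching condition are assumptions, not the paper's implementation).

```python
from enum import Enum, auto

class Role(Enum):
    FOLLOWER = auto()   # robot infers human intention and assists proactively
    LEADER = auto()     # robot (joystick-teleoperated) drives the task

def next_role(role, first_phase_done):
    """Switch from follower to leader once the first phase completes."""
    if role is Role.FOLLOWER and first_phase_done:
        return Role.LEADER
    return role

role = Role.FOLLOWER
role = next_role(role, first_phase_done=True)
print(role)  # Role.LEADER
```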
We report results from a collaborative project that investigated the deployment of humanoid robotic solutions in aircraft manufacturing, for assembly operations where wheeled or rail-ported robotic platforms cannot gain access. Recent developments in multi-contact planning and control, bipedal walking, embedded SLAM, whole-body multi-sensory task-space optimization control, and contact detection and safety suggest that humanoids could be a plausible solution for automation given the specific requirements of such large-scale manufacturing sites. The main challenge was to integrate these scientific and technological advances into two existing humanoid platforms: the position-controlled HRP-4 and the torque-controlled TORO. This integration effort was demonstrated in a bracket assembly operation inside a 1:1-scale A350 mock-up of the front part of the fuselage at the Airbus Saint-Nazaire site. We present and discuss the main results achieved in this project and provide recommendations for future work.
Advances in brain-computer interface (BCI) technology allow people to actively interact with the world through surrogates. Controlling real humanoid robots via a BCI as intuitively as we control our own bodies is a challenge for current research in robotics and neuroscience. To interact successfully with the environment, the brain integrates multiple sensory cues into a coherent representation of the world. Cognitive neuroscience studies demonstrate that multisensory integration can yield a gain relative to a single modality and ultimately improve overall sensorimotor performance. For example, reactivity to simultaneous visual and auditory stimuli may exceed reactivity to the sum of the same stimuli delivered in isolation or in temporal sequence. Yet, little is known about whether audio-visual integration can improve the control of a surrogate. To explore this issue, we provided human footstep sounds as audio feedback to BCI users while they controlled a humanoid robot. Participants were asked to steer their robot surrogate and perform a pick-and-place task through SSVEP-based BCI. We found that audio-visual synchrony between the footstep sounds and the humanoid's actual gait reduced the time required to steer the robot. Thus, auditory feedback congruent with the humanoid's actions may improve the motor decisions of the BCI user and strengthen the feeling of control over the robot. Our results shed light on the possibility of improving robot control by combining multisensory feedback for the BCI user.
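A common way to decode SSVEP commands is to compare spectral power at the candidate flicker frequencies; the sketch below is an illustrative assumption about such a pipeline (sampling rate, frequencies, and single-channel processing are not taken from the study).

```python
import numpy as np

FS = 256.0                             # EEG sampling rate in Hz (assumed)
STIM_FREQS = [8.0, 10.0, 12.0, 15.0]   # candidate flicker frequencies (assumed)

def decode_ssvep(eeg, fs=FS, freqs=STIM_FREQS):
    """Return the stimulation frequency with the highest FFT power."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2          # power spectrum
    fft_freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs) # matching frequency bins
    powers = [spectrum[np.argmin(np.abs(fft_freqs - f))] for f in freqs]
    return freqs[int(np.argmax(powers))]

# Synthetic 2-second trial: 12 Hz SSVEP component plus noise.
t = np.arange(0, 2.0, 1.0 / FS)
eeg = np.sin(2 * np.pi * 12.0 * t) + 0.3 * np.random.randn(len(t))
print(decode_ssvep(eeg))  # 12.0 (with high probability)
```

Each decoded frequency would then be mapped to a discrete robot command (e.g., turn left/right, walk, grasp), which is the steering interface whose timing the audio feedback was found to improve.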