Most robotic grasping tasks assume a stationary or fixed object. In this paper, we explore the requirements for tracking and grasping a moving object. The focus of our work is to achieve a high level of interaction between a real-time vision system capable of tracking moving objects in 3-D and a robot arm with gripper that can be used to pick up a moving object. We are interested in exploring the interplay of hand-eye coordination for dynamic grasping tasks such as grasping parts on a moving conveyor system, assembling articulated parts, or grasping from a mobile robotic system. Coordination between an organism's sensing modalities and motor control system is a hallmark of intelligent behavior, and we are pursuing the goal of building an integrated sensing and actuation system that can operate in dynamic as opposed to static environments. The system we have built addresses three distinct problems in robotic hand-eye coordination for grasping moving objects: fast computation of 3-D motion parameters from vision, predictive control of a moving robotic arm to track a moving object, and interception and grasping. The system is able to operate at approximately human arm movement rates, and experimental results are presented in which a moving model train is tracked, stably grasped, and picked up by the system. The algorithms we have developed that relate sensing to actuation are quite general and applicable to a variety of complex robotic tasks that require visual feedback for arm and hand control.
This paper describes a new real-time tracking algorithm used in conjunction with a predictive filter to allow real-time visual servoing of a robotic arm that is following a moving object. The system consists of two calibrated (but unregistered) cameras that provide images to a real-time, pipelined-parallel optic-flow algorithm that can robustly compute optic-flow and calculate the 3-D position of a moving object at approximately 5 Hz. These 3-D positions of the moving object serve as input to a predictive kinematic control algorithm that uses an α − β − γ filter to update the position of a robotic arm tracking the moving object. Experimental results are presented for the tracking of a moving model train along a variety of different trajectories.
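The α − β − γ filter mentioned above is a fixed-gain state estimator for position, velocity, and acceleration. The following sketch shows the general form of such a filter for a single axis; the gain values, sample period, and class interface are illustrative assumptions, not details taken from the paper.

```python
class AlphaBetaGammaFilter:
    """Fixed-gain position/velocity/acceleration tracker (one axis).

    A minimal sketch of the standard alpha-beta-gamma filter; the
    gains and sample period dt are free parameters, not the paper's.
    """

    def __init__(self, alpha, beta, gamma, dt):
        self.alpha, self.beta, self.gamma, self.dt = alpha, beta, gamma, dt
        self.x = 0.0   # estimated position
        self.v = 0.0   # estimated velocity
        self.a = 0.0   # estimated acceleration

    def update(self, z):
        """Fold in one position measurement z and return the new estimate."""
        dt = self.dt
        # Predict the state one sample ahead (constant-acceleration model).
        x_pred = self.x + self.v * dt + 0.5 * self.a * dt * dt
        v_pred = self.v + self.a * dt
        # Innovation: measurement minus prediction.
        r = z - x_pred
        # Correct each state component with its fixed gain.
        self.x = x_pred + self.alpha * r
        self.v = v_pred + (self.beta / dt) * r
        self.a = self.a + (2.0 * self.gamma / (dt * dt)) * r
        return self.x

    def predict(self, horizon):
        """Extrapolate the estimate 'horizon' seconds ahead,
        as needed for predictive arm control."""
        return self.x + self.v * horizon + 0.5 * self.a * horizon ** 2
```

In a visual-servoing loop of this kind, `update` would be called on each new 3-D position from the vision system (here at roughly 5 Hz, i.e. dt ≈ 0.2 s), and `predict` would supply the look-ahead target that compensates for processing and actuation delay.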
SUMMARY: This paper presents a new program package for the generation of efficient manipulator kinematic and dynamic equations in symbolic form. The basic algorithm belongs to the class of customized algorithms that reduce the computational burden by taking into account the specific characteristics of the manipulator to be modelled. The output of the package is high-level computer program code for evaluation of various kinematic and dynamic variables: the homogeneous transformation matrix between the hand and base coordinate frame, Jacobian matrices, driving torques and the elements of dynamic model matrices. The dynamic model is based on the recursive Newton-Euler equations. The application of recursive symbolic relations yields nearly minimal numerical complexity. Further improvement of computational efficiency is achieved by introducing different computational rates for the terms depending on joint angles, velocities and accelerations. A comparative study of numerical complexity for several typical industrial robots is presented.
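To illustrate the idea of customized symbolic modelling described above, the sketch below derives the driving torque for the simplest possible case, a single revolute link with a point mass, substitutes the robot's numeric parameters into the symbolic expression, and emits plain code for runtime evaluation. The use of SymPy and the specific parameter values are assumptions for illustration; the paper's package is its own implementation.

```python
import sympy as sp

# Symbolic joint variables and link parameters for a single revolute
# link with a point mass m at distance l from the axis (illustrative).
q, dq, ddq = sp.symbols('q dq ddq')   # joint angle, velocity, acceleration
m, l, g = sp.symbols('m l g')          # mass, link length, gravity

# For this one-link case the recursive Newton-Euler equations reduce to
# a single torque balance: inertial term plus gravity term.
tau = sp.simplify(m * l**2 * ddq + m * g * l * sp.cos(q))

# "Customization": folding the manipulator's numeric parameters into the
# expression at generation time removes them from the runtime code.
tau_custom = tau.subs({m: 1.0, l: 0.5, g: 9.81})

# Emit high-level code for the customized expression.
print(sp.ccode(tau_custom))
```

The same substitute-then-emit pattern scales to full manipulators, where folding in link parameters and pruning zero terms is what yields the near-minimal operation counts the abstract refers to.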
Most robotic grasping tasks assume a stationary or fixed object. In this paper, we explore the requirements for grasping a moving object. This task requires proper coordination between at least three separate subsystems: real-time vision sensing, trajectory planning/control, and grasp planning. As with humans, our system first visually tracks the object's 3-D position. Because the object is in motion, this must be done in real-time to coordinate the motion of the robotic arm as it tracks the object. The vision system feeds an arm control algorithm that plans a trajectory. The arm control algorithm is implemented in two steps: 1) filtering and prediction, and 2) kinematic transformation computation. Once the trajectory of the object is tracked, the hand must intercept the object to actually grasp it. We present experimental results in which a moving model train is tracked, stably grasped, and picked up by the system.
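The two arm-control steps named above can be sketched concretely. The example below pairs a constant-velocity prediction step with a closed-form kinematic transformation (inverse kinematics) for a planar two-link arm; the link lengths, function names, and the choice of a 2-DOF planar arm are illustrative assumptions, not the paper's actual arm or algorithm.

```python
import math

# Illustrative link lengths for a planar 2-link (2R) arm, in meters.
L1, L2 = 0.4, 0.3

def predict(pos, vel, lookahead):
    """Step 1 (filtering and prediction, simplified): extrapolate the
    tracked object's position 'lookahead' seconds ahead at constant
    velocity."""
    return (pos[0] + vel[0] * lookahead, pos[1] + vel[1] * lookahead)

def inverse_kinematics(x, y):
    """Step 2 (kinematic transformation): closed-form IK for a 2R planar
    arm, elbow-down solution."""
    r2 = x * x + y * y
    c2 = (r2 - L1 * L1 - L2 * L2) / (2 * L1 * L2)
    c2 = max(-1.0, min(1.0, c2))        # clamp for numerical safety
    theta2 = math.acos(c2)
    theta1 = math.atan2(y, x) - math.atan2(L2 * math.sin(theta2),
                                           L1 + L2 * math.cos(theta2))
    return theta1, theta2

def forward_kinematics(theta1, theta2):
    """Map joint angles back to the Cartesian hand position (for checking)."""
    x = L1 * math.cos(theta1) + L2 * math.cos(theta1 + theta2)
    y = L1 * math.sin(theta1) + L2 * math.sin(theta1 + theta2)
    return x, y

# One control cycle: predict where the object will be, then compute the
# joint angles that place the hand there.
target = predict((0.30, 0.20), (0.10, 0.00), 0.5)
angles = inverse_kinematics(*target)
```

Running prediction before the kinematic transformation, rather than servoing on the raw measurement, is what lets the hand intercept the object despite sensing and actuation latency.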