The paper investigates the problem of controlling the speed of robots in collaborative workcells for automated manufacturing. The solution is tailored to robotic cells for the cooperative assembly of aircraft fuselage panels, where only structural elements are present and robots and humans can share the same workspace, but no physical contact is allowed unless it occurs at zero robot speed. The proposed approach addresses the minimal set of requirements of an industrial Human-Robot Collaboration (HRC) task: precise and reliable human detection and tracking in the shared workspace, and correct robot task execution with minimum cycle time while ensuring safety for human operators. These requirements often conflict with each other. The former concerns not only safety but also the need to avoid unnecessary robot stops or slowdowns caused by false-positive human detections. The latter, according to current regulations, concerns the need to compute the minimum protective separation distance between the human operator and the robots, adjusting their speed when dangerous situations occur. This paper proposes a novel fuzzy inference approach that controls robot speed to enforce safety while maximizing robot productivity by minimizing cycle time. The approach is supported by a sensor fusion algorithm that merges images acquired from different depth sensors with those obtained from a thermal camera, using a machine learning approach. The methodology is experimentally validated in two experiments, the first at lab scale and the second performed on a full-scale robotic workcell for the cooperative assembly of aeronautical structural parts.
Note to Practitioners: The paper discusses how to reconcile human safety specifications with production requirements in collaborative robotized assembly systems.
State-of-the-Art (SoA) approaches cover only a few aspects of human detection and robot speed scaling. The present research work proposes a complete pipeline that starts from a robust human tracking algorithm and scales the robot speed in real time. An innovative multimodal perception system composed of two depth cameras and a thermal camera monitors the collaborative workspace. The speed scaling algorithm is optimized to account for different human behaviors in both less risky and more dangerous situations, guaranteeing operator safety and minimum production time for better profitability and efficiency of collaborative workstations. The algorithm estimates the operator's intention for real-time computation of the minimum protective distance according to current safety regulations. The robot speed is changed smoothly, which benefits operators psychologically, in the case of both single and multiple workers. The result is a complete system that is easily implementable on a standard industrial workcell.
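The minimum protective separation distance mentioned above is standardized by ISO/TS 15066 for speed-and-separation monitoring. The sketch below illustrates the underlying idea: the safe distance grows with both human and robot speed, so the robot speed can be scaled down until the measured distance is safe. All parameter values and the discrete scaling ladder are illustrative assumptions, not the fuzzy inference scheme of the paper.

```python
def protective_distance(v_h, v_r, t_r, t_s, s_s, c=0.2, z_d=0.1, z_r=0.05):
    """Minimum protective separation distance S_p [m] per ISO/TS 15066.

    v_h : operator speed toward the robot [m/s]
    v_r : current robot speed [m/s]
    t_r : robot reaction time [s]
    t_s : robot stopping time [s]
    s_s : robot stopping distance at speed v_r [m]
    c, z_d, z_r : intrusion distance and sensing/position uncertainties [m]
    """
    s_h = v_h * (t_r + t_s)   # distance covered by the human while the robot reacts and stops
    s_r = v_r * t_r           # distance covered by the robot before it starts braking
    return s_h + s_r + s_s + c + z_d + z_r

def scaled_speed(d, v_nominal, v_h, t_r, t_s, s_s_nominal):
    """Pick the highest speed fraction for which the measured human-robot
    distance d still exceeds the protective distance; stop otherwise.
    (A simple discrete ladder stands in for the paper's fuzzy inference.)"""
    for scale in (1.0, 0.75, 0.5, 0.25, 0.0):
        v = v_nominal * scale
        # assume stopping distance shrinks proportionally with speed
        if d >= protective_distance(v_h, v, t_r, t_s, s_s_nominal * scale):
            return v
    return 0.0
```

For example, with a human approaching at 1.6 m/s, a 0.1 s reaction time, a 0.3 s stopping time, and a 0.3 m stopping distance, a measured distance of 2 m allows full speed, while 1 m forces a stop, since even at zero robot speed the human contribution plus uncertainties nearly exhausts the margin.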
This paper presents the design and calibration of a new force/tactile sensor for robotic applications. The sensor is designed to provide the robotic grasping device with a sensory system mimicking the human sense of touch, namely a device sensitive to contact forces, object slip, and object geometry. This type of perception is of paramount importance not only in dexterous manipulation but also in simple grasping tasks, especially when objects are fragile and only a minimum grasping force can be applied to hold the object without damaging it. Moreover, sensing only forces and not moments can be very limiting when an object is grasped far from its center of gravity. Therefore, the perception of torsional moments is a key requirement of the designed sensor. Furthermore, the sensor is also the mechanical interface between the gripper and the manipulated object, so its design should also consider the requirements for correctly holding the object. The most relevant of these requirements is the ability to withstand a torsional moment, which makes a soft, distributed contact necessary. The presence of a soft contact poses a number of challenges in the calibration of the sensor, and addressing them is another contribution of this work. Experimental validation is provided in real grasping tasks with two sensors mounted on an industrial gripper.
Tactile data perception is of paramount importance in today's robotics applications. This paper describes the latest design of the tactile sensor developed in our laboratory. Both the hardware and firmware concepts are reported in detail to allow the research community to reproduce the sensor and adapt it to their needs. The sensor is based on optoelectronic technology, and the pad shape can be adapted to various robotics applications. A flat surface, like the one proposed in this paper, is well suited when the objects are smaller than the pad and/or shape recognition is needed, while a domed pad can be used to manipulate bigger objects. Compared to the previous version, the novel tactile sensor has a larger sensing area and a more robust electronic, mechanical, and software design that yields less noise and higher flexibility. The proposed design exploits standard PCB manufacturing processes and advanced but now commercially available 3D printing processes for the realization of all components. A GitHub repository has been prepared with all the files needed to reproduce the sensor. The whole sensor has been tested with a maximum load of 15 N, showing a sensitivity of 0.018 V/N. Moreover, a complete and detailed characterization of both the single taxel and the whole pad is reported to show the potential of the sensor in terms of response time, repeatability, hysteresis, and signal-to-noise ratio.
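The reported figures (0.018 V/N sensitivity, 15 N maximum tested load) imply a simple linear voltage-to-force conversion over the characterized range. The helper below is a hypothetical sketch of that conversion; the function name, the offset voltage, and the linearity assumption are ours, not part of the published design.

```python
SENSITIVITY_V_PER_N = 0.018   # reported sensitivity [V/N]
MAX_TESTED_LOAD_N = 15.0      # maximum load used in the characterization [N]

def voltage_to_force(v_out, v_offset=0.0):
    """Estimate the normal force [N] from a taxel output voltage [V],
    assuming a linear response around the calibrated offset voltage."""
    force = (v_out - v_offset) / SENSITIVITY_V_PER_N
    if force > MAX_TESTED_LOAD_N:
        raise ValueError("reading outside the characterized 0-15 N range")
    return max(force, 0.0)    # clamp small negative readings due to noise
```

A real calibration would replace the single sensitivity constant with a per-taxel map identified experimentally, as the soft pad response varies across the sensing area.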
Modern scenarios in robotics involve human-robot collaboration or robot-robot cooperation in unstructured environments. In human-robot collaboration, the objective is to relieve humans of repetitive and wearing tasks. This is the case in a retail store, where the robot could help a clerk refill a shelf or an elderly customer pick an item from an uncomfortable location. In robot-robot cooperation, automated logistics scenarios, such as warehouses, distribution centers, and supermarkets, often require repetitive and sequential pick-and-place tasks that can be executed more efficiently by exchanging objects between robots, provided that they are endowed with object handover abilities. Using a robot to pass objects is justified only if the handover operation is sufficiently intuitive for the involved humans, fluid, and natural, with a speed comparable to that of a typical human-human object exchange. The approach proposed in this paper relies strongly on visual and haptic perception combined with suitable algorithms for controlling both robot motion, to allow the robot to adapt to human behavior, and grip force, to ensure a safe handover. The control strategy combines model-based reactive control methods with an event-driven state machine encoding a human-inspired behavior during a handover task, which involves both linear and torsional loads, without requiring explicit learning from human demonstration. Experiments in a supermarket-like environment, with humans and robots communicating only through haptic cues, demonstrate the relevance of force/tactile feedback in accomplishing handover operations in a collaborative task.
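The event-driven state machine driven by haptic cues can be pictured as follows. This is a minimal illustrative sketch in the spirit of the approach described above: the states, force thresholds, and transition logic are our assumptions, not the controller published in the paper.

```python
from enum import Enum, auto

class HandoverState(Enum):
    HOLDING = auto()     # robot holds the object with a grip-force safety margin
    RELEASING = auto()   # human pull detected, grip force is ramped down
    DONE = auto()        # object transferred, gripper fully opened

class HandoverFSM:
    """Toy robot-to-human handover state machine driven only by haptic cues."""

    def __init__(self, pull_threshold=2.0, release_force=0.2):
        self.state = HandoverState.HOLDING
        self.pull_threshold = pull_threshold  # external load [N] that signals a human pull
        self.release_force = release_force    # grip force [N] below which the gripper opens

    def step(self, external_load, grip_force):
        """Advance the FSM given the measured external load and grip force [N]."""
        if self.state is HandoverState.HOLDING and external_load > self.pull_threshold:
            self.state = HandoverState.RELEASING
        elif self.state is HandoverState.RELEASING and grip_force < self.release_force:
            self.state = HandoverState.DONE
        return self.state
```

Calling `step` in the control loop, the machine stays in HOLDING until the tactile sensors register a pull exceeding the threshold, then transitions to RELEASING while a separate grip-force controller ramps the force down, and finally to DONE once the force drops below the release level; a full implementation would also monitor torsional loads and slip.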