Abstract. This paper describes Team Delft's robot, which won the Amazon Picking Challenge 2016, taking first place in both the Picking and the Stowing competitions. The goal of the challenge is to automate pick-and-place operations in unstructured environments, specifically the shelves of an Amazon warehouse. Team Delft's robot is based on an industrial robot arm, 3D cameras, and a customized gripper. The robot's software uses ROS to integrate off-the-shelf components with modules developed specifically for the competition, applying Deep Learning and other AI techniques to object recognition and pose estimation, grasp planning, and motion planning. This paper describes the main components of the system and discusses its performance and results at the Amazon Picking Challenge 2016 finals.
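The abstract names a ROS-based integration of perception, grasp planning, and motion planning, but gives no interfaces. Below is a minimal sketch of how one such module (vision) could be wired as a ROS node; the topic names, message types, and the `detect_object` placeholder are illustrative assumptions, not Team Delft's actual code.

```python
#!/usr/bin/env python
# Minimal sketch of a ROS vision node, assuming hypothetical topic
# names and a placeholder detector; not Team Delft's actual software.
import rospy
from sensor_msgs.msg import Image
from geometry_msgs.msg import PoseStamped

def detect_object(image_msg):
    """Placeholder for a deep-learning detector returning an object pose."""
    pose = PoseStamped()
    pose.header = image_msg.header
    # ... run object recognition and pose estimation here ...
    return pose

class VisionNode(object):
    def __init__(self):
        # Publish estimated object poses for a downstream grasp planner.
        self.pose_pub = rospy.Publisher('/object_pose', PoseStamped, queue_size=1)
        # Subscribe to the 3D camera stream.
        rospy.Subscriber('/camera/image_raw', Image, self.on_image)

    def on_image(self, msg):
        self.pose_pub.publish(detect_object(msg))

if __name__ == '__main__':
    rospy.init_node('vision_node')
    VisionNode()
    rospy.spin()
```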
Abstract. This paper presents our research platform SafeVRU for the interaction of self-driving vehicles with Vulnerable Road Users (VRUs, i.e., pedestrians and cyclists). The paper details the design (implemented with a modular structure within ROS) of the full stack of vehicle localization, environment perception, motion planning, and control, with emphasis on the environment perception and planning modules. The environment perception detects the VRUs using a stereo camera and predicts their paths with Dynamic Bayesian Networks (DBNs), which can account for switching dynamics. The motion planner is based on model predictive contouring control (MPCC) and takes into account vehicle dynamics, control objectives (e.g., desired speed), and the perceived environment (i.e., the predicted VRU paths with behavioral uncertainties) over a certain time horizon. We present simulation and real-world results to illustrate the ability of our vehicle to plan and execute collision-free trajectories in the presence of VRUs.

I. INTRODUCTION

Every year, between 20 and 50 million people are involved in road accidents, mostly caused by human error [1]. According to [1], approximately 1.3 million people lose their lives in these accidents. Half of the victims are vulnerable road users (VRUs), such as pedestrians and cyclists. Self-driving vehicles can help reduce these fatalities [2]. Active safety features, such as autonomous emergency braking (AEB), are increasingly found on board vehicles on the market to improve VRU safety (see [3] for a recent overview). In addition, some vehicles already automate steering functionality (e.g., [4], [5]), but still require the driver to initiate the maneuver.

Major challenges must be addressed to ensure safety and performance while driving in complex urban environments [6] where VRUs are present. The self-driving vehicle should be aware of the presence of the VRUs and be able to infer their intentions in order to plan its path accordingly and avoid collisions. In this respect, motion planning methods are required to provide safe (collision-free) and system-compliant performance in complex environments with static and moving obstacles (refer to [7], [8] for an overview).

In real-world applications, the information on the pose (i.e., position and orientation) of other traffic participants comes from a perception module. The perception module should provide the planner with information not only on the current position of the other road users, but also on their predicted future paths.

† The authors contributed equally to the paper.
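The abstract mentions DBNs that account for switching dynamics without giving the model. The sketch below illustrates one common concrete instance of that idea: a switching linear dynamical system whose mode-weighted prediction is rolled forward over the horizon. The two modes (walking, stopping), their dynamics, and the transition matrix are illustrative assumptions, not the paper's learned model.

```python
# Sketch of forward path prediction with a switching linear dynamical
# system (a simple DBN with switching dynamics). All parameters below
# are illustrative assumptions, not SafeVRU's.
import numpy as np

dt = 0.1  # prediction step [s]
# State: [x, y, vx, vy]. Mode 0: constant velocity (walking).
F_walk = np.eye(4)
F_walk[0, 2] = F_walk[1, 3] = dt
# Mode 1: stopping (velocity decays each step).
F_stop = F_walk.copy()
F_stop[2, 2] = F_stop[3, 3] = 0.5
modes = [F_walk, F_stop]

# Mode transition probabilities p(m_k | m_{k-1}).
T = np.array([[0.95, 0.05],
              [0.10, 0.90]])

def predict_path(x0, mode_probs, horizon=20):
    """Probability-weighted mixture of per-mode predicted positions."""
    means = [x0.copy() for _ in modes]   # per-mode state means
    probs = np.asarray(mode_probs, float)
    path = []
    for _ in range(horizon):
        means = [F @ m for F, m in zip(modes, means)]
        probs = T.T @ probs              # evolve the mode belief
        mix = sum(p * m for p, m in zip(probs, means))
        path.append(mix[:2])             # expected (x, y) position
    return np.array(path)

# Pedestrian 2 m to the side, walking at 1.4 m/s toward the lane.
print(predict_path(np.array([0.0, 2.0, 0.0, -1.4]), [0.8, 0.2])[:5])
```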
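Likewise, the abstract names MPCC as the planner but gives no formulation. Below is a minimal sketch of the idea using CasADi's Opti stack: minimize contouring and lag errors relative to a progress variable along a reference path, subject to vehicle dynamics and a keep-out constraint around the predicted VRU position at each step. The straight reference path p(s) = (s, 0), the unicycle model, the weights and limits, and the hard-coded pedestrian prediction are all illustrative assumptions, not the SafeVRU implementation.

```python
# Minimal MPCC sketch with CasADi; parameters are illustrative only.
import casadi as ca
import numpy as np

N, dt = 20, 0.1                      # horizon steps and step size [s]
opti = ca.Opti()
X = opti.variable(4, N + 1)          # state: x, y, heading, path progress s
U = opti.variable(3, N)              # input: speed v, yaw rate w, progress rate vs

opti.subject_to(X[:, 0] == 0)        # start at the origin, on the path

cost = 0
for k in range(N):
    x, y, s = X[0, k + 1], X[1, k + 1], X[3, k + 1]
    v, w, vs = U[0, k], U[1, k], U[2, k]
    # Unicycle dynamics plus virtual progress along the reference path.
    opti.subject_to(X[0, k + 1] == X[0, k] + v * ca.cos(X[2, k]) * dt)
    opti.subject_to(X[1, k + 1] == X[1, k] + v * ca.sin(X[2, k]) * dt)
    opti.subject_to(X[2, k + 1] == X[2, k] + w * dt)
    opti.subject_to(X[3, k + 1] == X[3, k] + vs * dt)
    # Contouring (lateral) and lag (along-path) errors for p(s) = (s, 0).
    e_c, e_l = y, x - s
    cost += 10 * e_c**2 + 10 * e_l**2 + 0.1 * w**2 + (v - 1.5)**2
    # Keep a safe distance from the predicted pedestrian position at step k+1
    # (e.g., the output of a DBN-style predictor as sketched above).
    px, py = 2.0, 1.5 - 1.0 * (k + 1) * dt
    opti.subject_to((x - px)**2 + (y - py)**2 >= 0.8**2)

opti.minimize(cost)
opti.subject_to(opti.bounded(0, U[0, :], 2))    # speed limits [m/s]
opti.subject_to(opti.bounded(-1, U[1, :], 1))   # yaw-rate limits [rad/s]
opti.subject_to(opti.bounded(0, U[2, :], 2))    # progress-rate limits
opti.set_initial(X[0, :], np.arange(N + 1) * 1.5 * dt)  # warm start along path
opti.set_initial(X[3, :], np.arange(N + 1) * 1.5 * dt)

opti.solver('ipopt')
sol = opti.solve()
print(sol.value(X[:2, :]).T)         # planned (x, y) trajectory
```

At each control cycle only the first input of the solved horizon would be applied and the problem re-solved with updated predictions, in the usual receding-horizon fashion.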