2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
DOI: 10.1109/iros45743.2020.9341482

Affordance-Based Grasping and Manipulation in Real World Applications

Abstract: In real-world applications, robotic solutions remain impractical due to the challenges that arise in unknown and unstructured environments. To perform complex manipulation tasks in complex and cluttered situations, robots need to be able to identify the interaction possibilities with the scene, i.e., the affordances of the objects encountered. In unstructured environments with noisy perception, insufficient scene understanding, and limited prior knowledge, this is a challenging task. In this work, we present an…

Cited by 21 publications (6 citation statements)
References 20 publications
“…[Flattened comparison table of telepresence systems: AeroVR (Yashin et al., 2019), ARMAR-6 (Pohl et al., 2020), ModelSegmentation (Kohn et al., 2018), AvatarDrone (Kim and Oh, 2021), PaintCopter (Vempati et al., 2019), AR (Liu and Shen, 2020), AR (Puljiz et al., 2020), GraspLook (Ponomareva et al., 2021), and the proposed system.] …tasks in industrial settings. By performing over 70 executions of the aforementioned tasks over days and nights, from spring to winter, and with different users and locations, the benefits of our VR-based telepresence concept for enhancing aerial manipulation capabilities in real-world industrial applications are illustrated.…”
Section: Outside
Mentioning confidence: 99%
“…As the main challenge of reconstruction-based methods is the limited communication bandwidth, Kohn et al. (2018) propose an object-recognition pipeline that replaces each detected object with a sparse virtual mesh and discards the dense sensor data. Pohl et al. (2020) use an RGB-D sensor to construct a VR scene for affordance-based manipulation with a humanoid, while Liu and Shen (2020) and Puljiz et al. (2020) create augmented reality for a drone and a manipulator, respectively. Pace et al. (2021) conduct a user study and argue that the point clouds of RGB-D sensors are noisy and inaccurate (with artifacts), which motivates point-cloud preprocessing methods for telepresence applications.…”
Section: Related Work
Mentioning confidence: 99%
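The point-cloud preprocessing that the quoted statement motivates can take many forms; one common step is statistical outlier removal, which discards points whose neighbourhood distances are anomalously large (e.g. RGB-D artifacts). The function below is a minimal sketch of that idea in pure NumPy — the name `remove_outliers` and the parameter defaults are illustrative, not from any cited system, and a real pipeline would use a KD-tree instead of dense pairwise distances.

```python
import numpy as np

def remove_outliers(points: np.ndarray, k: int = 8, std_ratio: float = 2.0) -> np.ndarray:
    """Keep points whose mean distance to their k nearest neighbours is
    within std_ratio standard deviations of the cloud-wide mean."""
    # Dense pairwise distances (fine for small clouds; use a KD-tree at scale).
    diffs = points[:, None, :] - points[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)
    # Mean distance to the k nearest neighbours of each point
    # (column 0 is the point itself at distance 0, so skip it).
    knn_mean = np.sort(dists, axis=1)[:, 1:k + 1].mean(axis=1)
    threshold = knn_mean.mean() + std_ratio * knn_mean.std()
    return points[knn_mean <= threshold]

# A dense cluster plus one far-away artifact point.
rng = np.random.default_rng(0)
cloud = np.vstack([rng.normal(0.0, 0.01, size=(50, 3)),
                   np.array([[5.0, 5.0, 5.0]])])
filtered = remove_outliers(cloud, k=8, std_ratio=2.0)
print(len(cloud), "->", len(filtered))  # the artifact point is removed
```

The same statistic underlies the outlier-removal filters offered by common point-cloud libraries; only the neighbourhood search strategy changes at scale.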
“…VGN [4] predicts 6-DoF grasps in clutter with a one-stage pipeline from input depth images. There is also a line of work that first estimates the affordances of an object or a scene and then detects grasps based on the estimated affordances [42, 24, 54]. In most prior works, deep networks are trained end-to-end with only grasp supervision.…”
Section: A. Learning Grasp Detection
Mentioning confidence: 99%
“…Grasping is one of the most important abilities of a robotic system. Here, we consider the example of an affordance-based grasping pipeline as in [50]. In such a pipeline, an affordance extraction component first visually detects grasping affordances of an object or parts of the scene.…”
Section: Grasping and Manipulation
Mentioning confidence: 99%
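The statement above describes a two-stage structure: an affordance extraction component proposes graspable regions, and a downstream component selects a grasp to execute. The sketch below shows only that control flow, not the actual method of [50] — `extract_affordances` uses a toy height-band heuristic where a real system would run a learned visual detector, and all names and thresholds are hypothetical.

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class GraspCandidate:
    position: np.ndarray  # 3-D grasp point in the scene frame
    approach: np.ndarray  # unit approach direction
    score: float          # affordance confidence in [0, 1]

def extract_affordances(points: np.ndarray) -> list:
    """Stand-in affordance extraction: flag points in a reachable height
    band as graspable. A real pipeline would use a learned detector here."""
    candidates = []
    for p in points:
        if 0.0 < p[2] < 0.3:                          # toy reachability band
            approach = np.array([0.0, 0.0, -1.0])     # top-down approach
            score = 1.0 - abs(p[2] - 0.15) / 0.15     # prefer mid-band points
            candidates.append(GraspCandidate(p, approach, score))
    return candidates

def select_grasp(candidates):
    """Pick the highest-scoring affordance as the grasp to execute."""
    return max(candidates, key=lambda c: c.score, default=None)

scene = np.array([[0.1, 0.0, 0.05],   # low point: weak candidate
                  [0.2, 0.1, 0.15],   # mid-band point: best candidate
                  [0.0, 0.3, 0.50]])  # too high: not graspable
grasp = select_grasp(extract_affordances(scene))
print(grasp.position, grasp.score)
```

Keeping affordance extraction and grasp selection as separate stages, as in the pipeline quoted above, lets either stage be swapped out (e.g. a different detector or a different scoring policy) without touching the other.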