2021
DOI: 10.1109/lra.2021.3096239
DDGC: Generative Deep Dexterous Grasping in Clutter

Abstract: Recent advances in multi-fingered robotic grasping have enabled fast 6-Degrees-of-Freedom (DOF) single-object grasping. Multi-finger grasping in cluttered scenes, on the other hand, remains mostly unexplored due to the added difficulty of reasoning over obstacles, which greatly increases the computational time needed to generate high-quality collision-free grasps. In this work, we address these limitations by introducing DDGC, a fast generative multi-finger grasp sampling method that can generate high-quality grasps in …

Cited by 46 publications (26 citation statements)
References 33 publications
“…The detailed predictions for each object include its 3D model, all naturally possible hand grasp types defined in Feix et al. (2016), and the corresponding refined 51-DoF 3D hand models obtained by minimizing a graspability loss. The Generative Deep Dexterous Grasping in Clutter (DDGC) method (Lundell et al., 2021) has a similar structure to GANhand, but it adds a depth channel and directly…”
Section: Learning-based Manipulation Methods (mentioning)
confidence: 99%
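The refinement step quoted above — adjusting hand pose parameters to minimize a graspability loss — can be sketched as plain gradient descent. This is a minimal illustration only: the loss below is a toy squared-distance to a target contact configuration (the cited work's loss is learned), and `refine_hand` and its finite-difference gradients are hypothetical helpers, not the papers' implementation.

```python
def graspability_loss(hand_params, target):
    """Toy stand-in for a graspability loss: squared distance between the
    hand configuration and a target contact configuration.
    Purely illustrative; the real loss in the cited work is learned."""
    return sum((h - t) ** 2 for h, t in zip(hand_params, target))

def refine_hand(hand_params, target, lr=0.1, steps=50, eps=1e-5):
    """Refine hand pose parameters by gradient descent on the loss.
    Finite-difference gradients keep the sketch dependency-free."""
    params = list(hand_params)
    for _ in range(steps):
        grad = []
        for i in range(len(params)):
            hi = params[:]; hi[i] += eps
            lo = params[:]; lo[i] -= eps
            grad.append(
                (graspability_loss(hi, target) - graspability_loss(lo, target))
                / (2 * eps)
            )
        params = [p - lr * g for p, g in zip(params, grad)]
    return params

# Example: refine a zero-initialized 5-DoF configuration toward a target.
init = [0.0] * 5
target = [0.3, -0.2, 0.5, 0.1, -0.4]
refined = refine_hand(init, target)
```

With a differentiable (here, analytic) loss, each step moves every parameter downhill; a learned graspability network would supply the gradients by backpropagation instead of finite differences.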
“…Kokic et al. [54] randomly sample grasp and roll angles, and offset distances, for each point in the point cloud. Both Lu et al. [43] and Lundell et al. [55,57] sample grasp candidates around the center of an object with a random orientation.…”
Section: A. Sampling (mentioning)
confidence: 99%
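The candidate-sampling scheme described above — grasp poses placed around an object's center with random orientation — can be sketched as follows. The function name, the fixed standoff `radius`, and the pose parameterization (approach direction plus wrist roll) are illustrative assumptions, not the exact procedure of the cited papers.

```python
import math
import random

def sample_grasp_candidates(object_center, n_samples=10, radius=0.15, seed=0):
    """Sample grasp poses on a sphere of fixed radius around the object
    center, each with a uniformly random wrist-roll angle.
    Hypothetical sketch of center-based candidate sampling."""
    rng = random.Random(seed)
    candidates = []
    for _ in range(n_samples):
        # Uniform direction on the unit sphere via normalized Gaussians.
        d = [rng.gauss(0.0, 1.0) for _ in range(3)]
        norm = math.sqrt(sum(x * x for x in d)) or 1.0
        d = [x / norm for x in d]
        # Place the gripper at a fixed standoff along that direction.
        position = [c + radius * x for c, x in zip(object_center, d)]
        # Approach vector points back toward the object center;
        # roll is a random rotation about that approach axis.
        approach = [-x for x in d]
        roll = rng.uniform(-math.pi, math.pi)
        candidates.append(
            {"position": position, "approach": approach, "roll": roll}
        )
    return candidates

# Example: five candidates around an object centered at (0.4, 0.0, 0.1).
grasps = sample_grasp_candidates([0.4, 0.0, 0.1], n_samples=5)
```

In practice, such candidates would then be filtered or ranked by a learned quality or collision model rather than executed directly.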
“…Similarly, Ottenhaus et al. [44] train a CNN to estimate the force-closure probability of a grasp under a small random perturbation. Lundell et al. [55] and Lundell et al. [57] train a grasp classifier using a Generative Adversarial Network (GAN) and use the discriminator loss to help produce realistic-looking grasps. Wen et al. [65] compute a continuous score for each grasp by looking at the stability of randomly sampled grasps in the proximity of the selected grasp.…”
Section: A. Sampling (mentioning)
confidence: 99%
“…[10]). More recent papers have shifted to complex model architectures such as generative models [13,14] and convolutional neural networks [15,16] to produce grasps directly. The main limitation of these works lies in restricted hardware evaluation.…”
Section: Simple and Dexterous Grasping in Literature (mentioning)
confidence: 99%