Robotics: Science and Systems XIX 2023
DOI: 10.15607/rss.2023.xix.089

LEAP Hand: Low-Cost, Efficient, and Anthropomorphic Hand for Robot Learning

Kenneth Shaw, Ananye Agarwal, Deepak Pathak

Abstract: Fig. 1: (a) LEAP Hand is an anthropomorphic dexterous robot hand designed for robot learning research. It can be assembled in under 4 hours for 2,000 USD, is composed of readily available parts, and is robust. (b) To-scale comparison of LEAP Hand and a human hand. (c-h) LEAP Hand in different power and precision grasps holding common objects. The hand design and code will be open-sourced to democratize access to hardware for anthropomorphic dexterous manipulation. Videos at https://leap-hand.github.io/ …
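Since the hand is assembled from off-the-shelf Dynamixel servos on a single serial bus, a typical control loop simply writes goal positions to each of the 16 joints. The sketch below is a minimal illustration using the Dynamixel SDK's Python bindings; the port path, baud rate, motor ID layout, and neutral pose are assumptions for illustration, not values taken from the paper or its released code.

```python
# Minimal sketch: writing goal positions to 16 joints over one Dynamixel bus.
# Control-table addresses are the standard Dynamixel X-series values; the
# port, baud rate, motor IDs, and target pose are assumed, not from the paper.
from dynamixel_sdk import PortHandler, PacketHandler

ADDR_TORQUE_ENABLE = 64    # X-series control-table address (1 byte)
ADDR_GOAL_POSITION = 116   # X-series goal-position register (4 bytes)
PROTOCOL_VERSION = 2.0

port = PortHandler("/dev/ttyUSB0")   # assumed serial port
packet = PacketHandler(PROTOCOL_VERSION)
port.openPort()
port.setBaudRate(4000000)            # assumed baud rate

MOTOR_IDS = range(16)                # one servo per joint; IDs 0-15 assumed
NEUTRAL_TICKS = 2048                 # mid-range of the 0-4095 position scale

for dxl_id in MOTOR_IDS:
    packet.write1ByteTxRx(port, dxl_id, ADDR_TORQUE_ENABLE, 1)  # enable torque
    packet.write4ByteTxRx(port, dxl_id, ADDR_GOAL_POSITION, NEUTRAL_TICKS)

port.closePort()
```

The released LEAP Hand code wraps this kind of low-level bus I/O in a higher-level API; this sketch only illustrates the underlying mechanism.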

Cited by 9 publications (4 citation statements)
References: 48 publications
“…We call this system Videodex as it learns dexterity from video. This builds on work published at CoRL 2022 by Shaw et al. (2023b).…”
Section: Introduction (mentioning)
confidence: 90%
“…We collect separate demonstrations on the real robot using the 2-finger gripper from xArm UFactory. Separate action priors are trained for the 16-DoF LEAP Hand (Shaw et al., 2023a) and the 2-finger gripper.…”
Section: Learning Priors From 1st Person Human Video (mentioning)
confidence: 99%
“…Egocentric video capture is proposed as a potential alternative solution and a powerful new evaluation tool for clinicians and researchers to evaluate hand and device use in unstructured environments. This has particular promise with the development of new ML tools for grasp/hand detection to reduce the laborious manual video tagging process for data analysis [17], [18]. In fact, this population has already demonstrated its openness to the use of this tool [19].…”
Section: A Background On Grasp Taxonomy Use (mentioning)
confidence: 99%