2020
DOI: 10.1007/978-3-030-33950-0_33

Learning to Grasp Without Seeing

Abstract: Can a robot grasp an unknown object without seeing it? In this paper, we present a tactile-sensing based approach to this challenging problem of grasping novel objects without prior knowledge of their location or physical properties. Our key idea is to combine touch-based object localization with tactile-based regrasping. To train our learning models, we created a large-scale grasping dataset, including more than 30K RGB frames and over 2.8 million tactile samples from 7800 grasp interactions of 52 objects. To…
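The abstract is truncated and gives no implementation detail. Purely as an illustration of how a learned tactile grasp-outcome predictor of this kind might look, the sketch below (not the authors' code; the network size, the 32x32 single-channel tactile-map input, and every name are assumptions) scores a candidate grasp from a tactile reading with a small convolutional network:

# Minimal sketch (hypothetical, not the paper's implementation): a CNN that
# scores a candidate regrasp from a tactile reading, assuming the tactile
# sample is rendered as a 32x32 single-channel pressure map.
import torch
import torch.nn as nn

class TactileGraspScorer(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 8 * 8, 64), nn.ReLU(),
            nn.Linear(64, 1),  # logit for grasp success
        )

    def forward(self, tactile_map):
        # tactile_map: (batch, 1, 32, 32) pressure map
        return self.head(self.features(tactile_map))

# Usage example with random data standing in for real tactile samples.
model = TactileGraspScorer()
logits = model(torch.randn(4, 1, 32, 32))
success_prob = torch.sigmoid(logits)

In a pipeline like the one the abstract describes, such a scorer would be one component; touch-based localization of the object would precede it, and the predicted success probability would drive the regrasping decision.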

Cited by 37 publications (32 citation statements)
References: 46 publications
“…An in-depth exploration of the tradeoffs between data-driven and analytic approaches would be an interesting future topic of study. Another concurrent work [32] explores grasping with a 3-axis force sensor, but reports comparatively low success rates, focusing instead on tactile localization without vision. Our method uses rich touch sensing that is aware of texture and surface shape, simultaneously incorporates multiple modalities, and can flexibly accommodate additional constraints, such as minimum-force grasps.…”
Section: B. Tactile Sensors in Grasping
confidence: 99%
“…More recently, researchers have also proposed other, more strongly supervised techniques for inferring object properties from touch. For example [8] proposed estimating the hardness of an object using a convolutional network, while [9], [10], [11] estimated material and textural properties. However, to our knowledge, none of these prior works have demonstrated that object instances (rather than just material properties) can be recognized entirely by touch and matched to corresponding visual observations.…”
Section: Related Work
confidence: 99%
“…Aside from recognition and perception, tactile information has also been extensively utilized for directly performing robotic manipulation skills, especially grasping. For example, [12], [13], [14], [10] predicted how suitable a given gripper configuration was for grasping. We take inspiration from these approaches, and use the tactile exploration strategy of [12], whereby the robot "feels" a random location of an object using a two-fingered gripper equipped with two GelSight [15], [16] touch sensors.…”
Section: Related Work
confidence: 99%
“…Vision is used to infer the geometric shape [24], track objects [40], infer object categories [26] and even direct control [27]. In recent years, the sense of touch has also received increasing attention for recognition [37] and feedback control [30]. But what about sound?…”
Section: Introduction
confidence: 99%