2022
DOI: 10.1109/tie.2021.3090697
In-Hand Object Localization Using a Novel High-Resolution Visuotactile Sensor

Cited by 66 publications (19 citation statements)
References 33 publications
“…The same group also estimated the in-hand object pose by comparing geometry pre-annotated from the object model with tactile information obtained from pressure sensors embedded in a Barrett robot hand [1]. Li et al. [16] and Cui et al. [17] used a GelSight sensor to estimate and adjust the pose of various workpieces for successful insertion. Pirozzi et al. [18] used a photo-reflector array to measure the deformation of soft finger pads and estimate the in-hand pose of electric wires.…”
Section: B. Tactile-Based Methods (mentioning)
confidence: 99%
“…[flattened table from the citing survey listing technologies commonly used in the field of visuotactile sensing: optical flow methods [28], [37], finite element models [63], [44], and neural networks [58], [53] (mainly machine learning-based approaches); speckle detection [77], [80], feature enhancement [81], [82], stereo vision [25], [94], and virtual stereo vision [23], [26] (mainly physical model-based approaches); together with the 2D/3D tactile quantities they recover, such as contact area, force distribution, slip field, friction coefficient, geometric features, and 3D geometry] By using a camera to photograph the markers prepared on the sensor's contact elastomer, a tactile image containing the position changes of the markers can be obtained, and further tactile information can be extracted by post-processing and analyzing this tactile image.…”
Section: Common Technologies (mentioning)
confidence: 99%
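The marker-tracking pipeline summarized in this excerpt can be illustrated with a minimal sketch, assuming a generic marker-based visuotactile sensor rather than the specific sensor of this paper or the cited survey: a reference frame and a contact frame are compared with OpenCV's Lucas-Kanade optical flow to recover the marker displacement field that post-processing would then analyze. The file names `tactile_reference.png` / `tactile_contact.png` and the tracking parameters are illustrative assumptions.

```python
# Minimal sketch (not from the cited works): recovering marker displacements
# between a reference and a contact tactile frame with Lucas-Kanade optical
# flow in OpenCV. File paths and parameter values are illustrative assumptions.
import cv2

ref = cv2.imread("tactile_reference.png", cv2.IMREAD_GRAYSCALE)  # hypothetical path
cur = cv2.imread("tactile_contact.png", cv2.IMREAD_GRAYSCALE)    # hypothetical path
assert ref is not None and cur is not None, "tactile frames not found"

# Detect marker-like corner features in the undeformed reference frame.
markers = cv2.goodFeaturesToTrack(ref, maxCorners=200, qualityLevel=0.01, minDistance=10)

# Track each marker into the contact frame.
tracked, status, _err = cv2.calcOpticalFlowPyrLK(ref, cur, markers, None)

# Keep successfully tracked markers and compute the 2D displacement field,
# which downstream processing can turn into contact-area, shear/force, or
# slip estimates as described in the excerpt above.
ok = status.ravel() == 1
displacement = (tracked[ok] - markers[ok]).reshape(-1, 2)
print("tracked markers:", ok.sum())
print("mean displacement (px):", displacement.mean(axis=0))
```

The same displacement field is what physical model-based approaches feed into force or slip estimation, and what learning-based approaches consume as the input image pair or flow map.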
“…Tactile sensors have been integrated into robotic hands to detect the interaction force and location [130]. However, some limitations remain to be solved in tactile sensors. The first issue is how to improve the flexibility of tactile sensors.…”
Section: Novel Tactile Sensors For Dexterous Manipulation (mentioning)
confidence: 99%