2023
DOI: 10.1109/tro.2023.3245983
Simulation, Learning, and Application of Vision-Based Tactile Sensing at Large Scale

Abstract: Large-scale robotic skin with tactile sensing ability is emerging with the potential for use in close-contact human-robot systems. Although recent developments in vision-based tactile sensing and related learning methods are promising, they have been mostly designed for small-scale use, such as by fingers and hands, in manipulation tasks. Moreover, learning perception for such tactile devices demands a huge tactile dataset, which complicates the data collection process. To address this, this study introduces a…


Cited by 22 publications (7 citation statements)
References 42 publications
“…The localization error in our work is 6.22-7.19mm. This range is comparable to that in Luu et al [40] and to the mesh size of 18mm × 9.5mm employed for localization in Duong et al [38] . To the best of our knowledge, very few studies in literature focused on force estimation in large area tactile sensing.…”
Section: Discussion (supporting, confidence: 79%)
“…The methodology in this paper is limited to normal force estimation and is not suitable for shear force sensing. In contrast, the other methodologies based on marker tracking [35][36][37][38][39][40], in principle, allow for complete mapping of the deformation field and hence the potential for directional and shear forces. In summary, the ELTac sensing module offers high localization and force estimation accuracies as well as multi-point contact detection capabilities all enabled by a single camera.…”
Section: Discussion (mentioning, confidence: 99%)
“…Narang et al built a linear-FEM-based tactile simulator for BioTac with Isaac Gym [31], achieving faster speeds than the commercial FEM software (ANSYS) [32]. Recently, Luu et al employed SOFA [33] to build a simulator for large-scale marker-cumvision-based-tactile sensor [34]. However, both [31] and [34] primarily used their simulators to collect supervised datasets for interpreting tactile signals, leaving the potential of using simulation to train manipulation policies unexplored.…”
Section: B. Tactile Sensor Simulation (mentioning, confidence: 99%)
“…The rapid development of vision technology has advanced hardness recognition of objects [6], but vision also has certain limitations: when recognizing objects with similar appearance and shape, such as plasticine and sand of the same color, visual recognition becomes difficult. Althoefer K et al [7] proposed a miniature tactile sensor that obtains tactile information from multiple gradients of force and from changes in the force and shape of the object, and feeds the collected tactile information into a deep learning network to process the feature information. Luu Q K et al [8] proposed a new tactile network that uses simulated datasets to train a tactile neural network to extract deep tactile information. Sundaram et al [9] assembled an array of piezoresistive pressure sensors on a knitted glove to produce a tactile dataset by contacting twenty-six types of objects, and fed the collected tactile data into a deep convolutional neural network [10] to obtain hardness classification of the objects [11].…”
Section: Introduction (mentioning, confidence: 99%)