2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
DOI: 10.1109/iros.2018.8594159
Object Recognition Through Active Sensing Using a Multi-Fingered Robot Hand with 3D Tactile Sensors

Cited by 38 publications (31 citation statements)
References 20 publications
“…When attempting an item classification task (Tactile Sensing, Test I: Item Classification section), we obtain 93% validation accuracy on 26 objects randomly placed in the tray to vary the grasp. This performance is comparable to other studies, such as Spiers et al 22 (94%), Schmitz et al 16 (88%), and Funabashi et al 25 (95%), but it is surpassed by Flintoff et al (99%). That said, a direct comparison between studies is not possible because the hands and tactile sensors differ, along with the objects and experiments.…”
Section: Tactile Sensing (supporting)
confidence: 89%
“…A grasp stabilization phase resulted in an increase of up to 29% accuracy on object recognition. Funabashi et al 25 use uSkin tactile sensors on an Allegro hand alongside proprioceptive joint angles as the input to a CNN. Objects are placed into the hand, and 95% accuracy is obtained when classifying 20 objects: 10 from the YCB set and 10 cylindrical objects.…”
Section: Introduction (mentioning)
confidence: 99%
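The citation above describes the paper's core pipeline: tactile maps from uSkin sensors are combined with proprioceptive joint angles and fed to a CNN that classifies the grasped object. The sketch below illustrates that fusion pattern in plain Python; all sizes (a 4x4 taxel patch with 3 force axes, 16 joint angles, 20 classes) and the single conv layer are illustrative assumptions, not the paper's actual architecture.

```python
import math
import random

random.seed(0)

# Assumed sizes (not from the paper): 4x4 fingertip taxel map with a
# 3-axis force reading per taxel (uSkin-style), 16 joint angles,
# 20 object classes.
C, H, W, J, CLASSES = 3, 4, 4, 16, 20

tactile = [[[random.gauss(0, 1) for _ in range(W)] for _ in range(H)]
           for _ in range(C)]
joints = [random.gauss(0, 1) for _ in range(J)]

def conv2d_valid(x, kernels):
    """Naive multi-channel 'valid' cross-correlation with ReLU."""
    out = []
    for k in kernels:                      # one CxKhxKw kernel per filter
        for i in range(H - 2):
            for j in range(W - 2):
                s = sum(x[c][i + di][j + dj] * k[c][di][dj]
                        for c in range(C)
                        for di in range(3) for dj in range(3))
                out.append(max(s, 0.0))    # ReLU
    return out

# 8 random 3x3 filters stand in for learned conv weights.
kernels = [[[[random.gauss(0, 0.1) for _ in range(3)] for _ in range(3)]
            for _ in range(C)] for _ in range(8)]

# Flatten the conv features and concatenate the proprioceptive joint
# angles, then apply a linear softmax classifier over object classes.
fused = conv2d_valid(tactile, kernels) + joints
weights = [[random.gauss(0, 0.1) for _ in fused] for _ in range(CLASSES)]
logits = [sum(w * f for w, f in zip(row, fused)) for row in weights]
m = max(logits)
exps = [math.exp(l - m) for l in logits]
probs = [e / sum(exps) for e in exps]

print(len(probs), round(sum(probs), 6))   # 20 1.0
```

The key design point is the concatenation step: tactile features and joint angles live in one fused vector, so the classifier can exploit hand posture as well as contact geometry.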
“…To carry out this type of task, a robotic hand or gripper with an adaptive design is needed, one that both grasps the person firmly and can control its force and movements to guarantee safety. In [11,12,13], information about the pressure exerted during the grasp is used; by means of machine learning techniques, it can be exploited to obtain high-level information, classify the contact, and even recognize objects.…”
Section: Introduction (unclassified)
“…In manipulation, tactile sensors are used in different ways. On one side, when the object model is partially or completely unknown, tactile sensors are used to reduce uncertainty and refine the object's geometric model, which can be used both to recognize the object (against a database) and to precisely identify the actual position of the object within the hand [26][27][28][29]. Another approach uses tactile sensors in manipulation strategies based on tactile feedback without relying on the object model; that is, the manipulation is performed even when the object model is completely unknown [18,30].…”
mentioning
confidence: 99%
“…On the other hand, regarding complementary hardware, vision systems have been used to complement tactile information in the exploration of unknown objects [37]; regarding complementary software, machine learning approaches have been proposed for the detection of slippage [38], for object recognition [29], for the adaptation of the grasping motion [39], and for the extraction of manipulation primitives for compliant robot hands [14].…”
mentioning
confidence: 99%