2016
DOI: 10.1109/toh.2016.2521378
Single-Grasp Object Classification and Feature Extraction with Simple Robot Hands and Tactile Sensors

Abstract: Classical robotic approaches to tactile object identification often involve rigid mechanical grippers, dense sensor arrays, and exploratory procedures (EPs). Though EPs are a natural method for humans to acquire object information, evidence also exists for meaningful tactile property inference from brief, non-exploratory motions (a 'haptic glance'). In this work, we implement tactile object identification and feature extraction techniques on data acquired during a single, unplanned grasp with a simple, underac…

Cited by 139 publications (89 citation statements); references 42 publications.
“…• Hand Crafted. This feature set is from [29], where raw signals are extracted at three specific events: before contact, when the finger-closing movement stalls due to object-finger contact, and after the fingers reach equilibrium. Choice of Classifiers.…”
Section: A. Learning Haptic Features
confidence: 99%
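The three-event feature extraction described above can be sketched as follows. This is a minimal illustration, not the cited implementation: the function name, array shapes, and the assumption that the stall and equilibrium timesteps are already known are all hypothetical.

```python
import numpy as np

def handcrafted_grasp_features(tactile, motor_pos, stall_idx, eq_idx):
    """Stack raw sensor snapshots at the three grasp events named in the
    citation: before contact (t=0), at the moment the finger-closing
    movement stalls, and once the fingers are in equilibrium.

    tactile:   (T, n_cells) tactile readings over time
    motor_pos: (T,) actuator positions over time
    """
    events = [0, stall_idx, eq_idx]
    # At each event, append the actuator position to that timestep's
    # tactile snapshot, then concatenate the three event vectors.
    parts = [np.append(tactile[t], motor_pos[t]) for t in events]
    return np.concatenate(parts)

# Toy single-grasp trace: 100 timesteps, 8 tactile cells
T, n = 100, 8
tactile = np.zeros((T, n))
motor = np.linspace(0.0, 1.0, T)
feat = handcrafted_grasp_features(tactile, motor, stall_idx=40, eq_idx=90)
print(feat.shape)  # (27,): 3 events x (8 tactile + 1 motor)
```

In practice the stall event would be detected from the motor trajectory (e.g. position plateauing while the close command is still active) rather than passed in by hand.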
“…Both tactile features and joint configurations are clustered by SOMs and classified by ANNs separately, and the classification results are merged to reach a final decision. In a more recent work [39], the actuator positions of the robot fingers and the tactile sensor values form the feature space for classifying objects with random forests; no exploratory motions are involved, as the data are acquired during a single, unplanned grasp.…”
Section: Literature Review
confidence: 99%
“…In [21], the proprioceptive data (finger configurations/positions) and the tactile features of a whole palpation sequence were concatenated into a single description for object classification; this combines tactile features with contact points, but loses the positions at which specific tactile features were collected. In recent work [22], an underactuated robot hand, with a row of TakkTile tactile sensors embedded in each finger link, was employed for object classification. The actuator positions and force sensor values form the feature space for classifying objects with random forests, but no exploratory motions were involved, as the data were acquired during a single, unplanned grasp.…”
Section: Object Recognition Based On Both Sensing Modalities
confidence: 99%
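The single-grasp pipeline these two statements describe — actuator positions plus tactile values as a flat feature vector, classified by a random forest — can be sketched on synthetic data. The feature dimensions, class means, and hyperparameters here are illustrative assumptions, not the cited paper's configuration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic single-grasp samples: 2 actuator positions + 8 tactile
# values per grasp, three object classes with different closure/contact
# signatures (purely illustrative).
n_per_class = 30
X_parts, y_parts = [], []
for label, (pos_mu, tact_mu) in enumerate([(0.2, 0.1), (0.6, 0.5), (1.0, 0.9)]):
    pos = rng.normal(pos_mu, 0.05, size=(n_per_class, 2))
    tact = rng.normal(tact_mu, 0.05, size=(n_per_class, 8))
    X_parts.append(np.hstack([pos, tact]))   # concatenate modalities
    y_parts.append(np.full(n_per_class, label))
X = np.vstack(X_parts)
y = np.concatenate(y_parts)

# One grasp -> one feature vector -> random-forest class prediction
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print(clf.score(X, y))
```

The key property the citations highlight is that the feature vector comes from one unplanned grasp: there is no exploratory motion sequence to concatenate, so the representation stays low-dimensional and cheap to collect.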
“…In total, four different features are used and compared: the Tactile-SIFT descriptors proposed in our previous work [17] and three features from the literature, i.e., Zernike moments (the best-performing feature in [15]), normalized Hu's moments [22], and raw image moments (up to order 2) [23]. Following [17], the dictionary size k was set to 50 in the experiments.…”
Section: A. Recognition Performances Of BoW Framework With Different …
confidence: 99%
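The simplest of the compared features, raw image moments up to order 2, treats the tactile array as a small grayscale image and computes m_pq = Σ_x Σ_y x^p y^q I(x, y) for p + q ≤ 2. A minimal sketch (function name and toy tactile image are assumptions for illustration):

```python
import numpy as np

def raw_moments(img, max_order=2):
    """Raw image moments m_pq = sum over pixels of x^p * y^q * I(y, x),
    for all p + q <= max_order. Returns a dict keyed by (p, q)."""
    ys, xs = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    feats = {}
    for p in range(max_order + 1):
        for q in range(max_order + 1 - p):
            feats[(p, q)] = float(np.sum((xs ** p) * (ys ** q) * img))
    return feats

# Toy 4x4 tactile "image" with a single active cell at x=2, y=1
img = np.zeros((4, 4))
img[1, 2] = 1.0
m = raw_moments(img)
# Contact centroid: (m10/m00, m01/m00) = (2.0, 1.0)
```

Up to order 2 this yields six values — m00, m10, m01, m20, m11, m02 — which is why it is the lowest-dimensional (and weakest) descriptor in the comparison; Hu's moments and Zernike moments build rotation-invariant combinations on top of such moments.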