2016 IEEE-RAS 16th International Conference on Humanoid Robots (Humanoids)
DOI: 10.1109/humanoids.2016.7803263
Combining sensory modalities and exploratory procedures to improve haptic object recognition in robotics

Cited by 10 publications (5 citation statements)
References 14 publications
“…Gorges et al [13] merge a sequence of grasps into a statistical description of the object that is used to classify the objects. In a recent work Higy et al [14] propose a method in which the robot identifies an object by carrying out different exploratory behaviours such as hand closure, and weighing and rotating the object. In their method the authors fuse sensor data from multiple sensors in a hierarchical classifier to differentiate objects.…”
Section: Introduction
confidence: 99%
“…Training instances in this scenario could be gathered autonomously by detecting invariances in the data which correspond to physical entities (e.g., coherent motion patterns (Wang and Gupta 2015) or bottom-up saliency cues). Strategies specific to the robotic domain could be devised by integrating multiple sensory modalities (Sinapov et al 2014; Higy et al 2016) and a repertoire of explorative actions (Montesano et al 2008; Fitzpatrick et al 2003; Högman et al 2016).…”
Section: Discussion and Future Work
confidence: 99%
“…Clearly, visual perception is only one of the possible sensory modalities equipping modern robots that can be involved in the object recognition process (see for example (Luo et al 2017; Higy et al 2016)). In addition, it has been shown that physical interaction with the environment can be used to aid perception, demonstrating that there is more than "just" vision to object recognition.…”
Section: Introduction
confidence: 99%
“…Visuo-tactile methods have attained attributes such as elasticity, mass and relational constraints [10] and object pose [11], [12]. Shape has been determined with methods such as tactile glances at discrete points on the object [13], visual and tactile feedback from grasping [14], combining visual and tactile exploratory procedures [15] and visuo-tactile fusion [16].…”
Section: Background and Related Work
confidence: 99%