2013 IEEE International Conference on Robotics and Automation
DOI: 10.1109/icra.2013.6631000

Using robotic exploratory procedures to learn the meaning of haptic adjectives

Cited by 94 publications (90 citation statements) · References 16 publications
“…It has been especially successful in grounding concepts [9] and objects [42,31] through human-robot interaction. Works like Chu et al [11] look at mapping from text to haptic signals. Kulick et al [29] consider active learning for teaching robot to ground relational symbols.…”
Section: Related Work
confidence: 99%
“…This feature function is denoted as time window feature function. In addition to these feature functions we also show results when using the features introduced by Chu et al [21]. Originally designed with the goal of object property learning, these features are divided into four distinct groups that try to depict not only the compliance, roughness, and thermal properties of the object, but also the correlation between the electrode data retrieved from the sensor.…”
Section: A Feature Comparison
confidence: 99%
“…Originally designed with the goal of object property learning, these features are divided into four distinct groups that try to depict not only the compliance, roughness, and thermal properties of the object, but also the correlation between the electrode data retrieved from the sensor. For a more detailed description of these features please refer to [21].…”
Section: A Feature Comparison
confidence: 99%
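The two excerpts above describe the features of Chu et al. [21] as four groups capturing compliance, roughness, thermal properties, and correlations between electrode channels. Purely as an illustration of that grouping idea (this is a minimal sketch, not a reproduction of the actual features in [21]; every function name, signal choice, and statistic below is an assumption), one might compute such descriptors from a single exploration trial like this:

```python
# Illustrative sketch only: NOT the exact features of Chu et al. [21].
# Shows one way to group tactile signals into compliance, roughness,
# thermal, and electrode-correlation style descriptors.
import numpy as np

def tactile_feature_groups(pressure, vibration, heat_flux, electrodes):
    """Compute four simple feature groups from one exploration trial.

    pressure, vibration, heat_flux: 1-D arrays sampled over the trial.
    electrodes: 2-D array of shape (n_samples, n_electrodes).
    All signal names and statistics here are assumptions for illustration.
    """
    # Compliance-like cue: how much the pressure rises under a press.
    compliance = np.array([pressure.max() - pressure[0], pressure.mean()])

    # Roughness-like cue: high-frequency vibration energy during sliding.
    vib_ac = vibration - vibration.mean()
    roughness = np.array([vib_ac.std(), np.abs(np.diff(vib_ac)).mean()])

    # Thermal cue: overall heat-flux level and its drift over the trial.
    t = np.arange(heat_flux.size)
    drift = np.polyfit(t, heat_flux, 1)[0]
    thermal = np.array([heat_flux.mean(), drift])

    # Electrode-correlation cue: mean pairwise correlation across channels.
    corr = np.corrcoef(electrodes.T)
    iu = np.triu_indices_from(corr, k=1)
    electrode_corr = np.array([corr[iu].mean()])

    return np.concatenate([compliance, roughness, thermal, electrode_corr])

# Tiny usage example with synthetic data standing in for sensor logs.
rng = np.random.default_rng(0)
n = 500
features = tactile_feature_groups(
    pressure=np.linspace(0.0, 1.0, n) + 0.01 * rng.standard_normal(n),
    vibration=0.05 * rng.standard_normal(n),
    heat_flux=1.0 - 0.3 * np.linspace(0.0, 1.0, n),
    electrodes=rng.standard_normal((n, 19)),
)
print(features.shape)  # (7,) feature vector for one exploratory procedure
```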
“…Considerable research has been conducted to infer tactile object properties using robots as well. Two recent examples are Drimus et al [2] and Chu et al [3], who introduced methods to infer multiple object properties by analyzing touch sensor input during several controlled exploration procedures.…”
Section: Introduction
confidence: 99%