2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
DOI: 10.1109/iros.2017.8206059
Associating grasp configurations with hierarchical features in convolutional neural networks

Cited by 7 publications (3 citation statements)
References 26 publications
“…However, more elaborate systems for action similarity evaluation and action knowledge transfer could be introduced for an extended ADT treatment. Using deep neural network features for sub-symbolic information representation as, for example, introduced by Ku et al (2017a,b) may be promising.…”
Section: Discussion
confidence: 99%
“…This implies that future work should focus on finding ways to analyze and compare activations in deep nets, e.g. [72], either by developing retinotopic feedback loops similar to how human vision works or other recurrent ways of learning abstractions, however without the need for pixel-wise labeling. Further on, when creating these abstractions we need to understand to what degree we should mimic human capabilities, as this will be a crucial component in human-robot interaction.…”
Section: Discussion
confidence: 99%
“…Ku et al [43] used convolutional layer activations to find the anthropomorphic grasping points in images. They created a mask that represented the grasping points for the robotic index finger and the thumb.…”
Section: Type
confidence: 99%
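
As context for this last statement, the following is a minimal sketch of the general technique it describes: thresholding convolutional-layer activations into a fingertip-location mask. This is not the cited authors' implementation; the backbone (torchvision VGG-16), the layer index, and the threshold value are illustrative assumptions.

import torch
import torch.nn.functional as F
from torchvision import models

def grasp_point_mask(image, layer_idx=28, threshold=0.5):
    """Binary mask of high-activation regions for a 1x3xHxW image tensor.

    Generic activation-to-mask sketch, not the method of Ku et al.;
    layer_idx and threshold are illustrative, not values from the paper.
    """
    features = models.vgg16(weights=models.VGG16_Weights.DEFAULT).features.eval()
    with torch.no_grad():
        x = image
        for i, layer in enumerate(features):
            x = layer(x)
            if i == layer_idx:  # stop at the chosen convolutional layer
                break
        # Collapse channels into one saliency map and normalize to [0, 1].
        saliency = x.mean(dim=1, keepdim=True)
        saliency = (saliency - saliency.min()) / (saliency.max() - saliency.min() + 1e-8)
        # Upsample to the input resolution, then threshold into a mask.
        saliency = F.interpolate(saliency, size=image.shape[-2:],
                                 mode="bilinear", align_corners=False)
        return (saliency > threshold).float()

# Example usage: mask = grasp_point_mask(torch.rand(1, 3, 224, 224))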