2013
DOI: 10.1007/s10514-013-9378-4

A comparison of learning strategies for biologically constrained development of gaze control on an iCub robot

Shaw, P. H., Law, J. A., & Lee, M. H. (2014). A comparison of learning strategies for biologically constrained development of gaze control on an iCub robot. Autonomous Robots, 37(1), 97-110.

Abstract: Gaze control requires the coordination of movements of both eyes and head to fixate on a target. We present a biologically constrained architecture for gaze control and show how the relationships between the coupled sensorimotor systems can be learnt autonomously from scratch, allowing for adaptation as the system grows or ch…

Cited by 12 publications (9 citation statements); references 13 publications.
“…It should however be noted that in many biological systems, the two are not independent, with the movements of the neck impacting on the movements of the eye. As such, a description of how they can both be learnt can be found in [12]. A similar mapping to that shown in figure 3 can be learnt on-line for linking the visual input to the neck motors for controlling gaze direction.…”
Section: B. Neck Map
confidence: 97%
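The citing statement above describes learning an on-line mapping from visual input to neck motor commands for gaze control. A minimal sketch of such a sensorimotor map is given below; the grid-based representation, field-of-view bounds, and learning rate are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

class GazeMap:
    """Toy on-line mapping from 2D retinal target position to a neck
    (pan, tilt) motor command, in the spirit of the learnt mapping
    described above.  Grid size, field of view and learning rate are
    illustrative assumptions, not values from the paper."""

    def __init__(self, grid=9, fov=30.0, lr=0.5):
        self.edges = np.linspace(-fov, fov, grid + 1)   # cell boundaries (deg)
        self.motors = np.zeros((grid, grid, 2))         # command stored per cell
        self.seen = np.zeros((grid, grid), dtype=bool)  # which links exist yet
        self.lr = lr

    def _cell(self, x, y):
        last = len(self.edges) - 2
        i = int(np.clip(np.searchsorted(self.edges, x) - 1, 0, last))
        j = int(np.clip(np.searchsorted(self.edges, y) - 1, 0, last))
        return i, j

    def update(self, x, y, motor_cmd):
        """After a movement fixates a target first seen at retinal
        position (x, y), link that cell to the command that worked."""
        i, j = self._cell(x, y)
        if not self.seen[i, j]:
            self.motors[i, j] = motor_cmd           # create a new link
            self.seen[i, j] = True
        else:                                       # refine an existing link
            self.motors[i, j] += self.lr * (np.asarray(motor_cmd) - self.motors[i, j])

    def lookup(self, x, y):
        """Return the learnt command for (x, y), or None if unlearnt."""
        i, j = self._cell(x, y)
        return self.motors[i, j] if self.seen[i, j] else None
```

Links are created the first time a retinal cell is visited and then refined incrementally, so the map adapts if the sensorimotor relationship changes, echoing the "from scratch" learning the abstract describes.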
“…Details regarding a specific implementation of this mapping, and how it can be learnt, can be found in [12]. An example of a learnt mapping is shown in figure 3.…”
Section: A. Saccade Map
confidence: 99%
“…Thus, like children, iCub integrates visual, auditory, tactile and proprioceptive information to generate behaviour, for example auditory and visual information in a word learning task (although which modalities contribute to a given simulation are decided a priori by the modeller). Thus far, iCub has captured a range of developmental phenomena, for example motor development (Tikhanoff, Cangelosi, & Metta, 2011), visuomotor development (Shaw, Law, & Lee, 2014), intrinsically motivated exploration (Maestre, Cully, Gonzales, & Doncieux, 2015), affordance-based verb learning (Marocco, Cangelosi, Fischer, & Belpaeme, 2010), and spatially-grounded noun learning (Morse et al., 2015; for a review see Cangelosi & Schlesinger, 2015).…”
Section: The iCub and the Epigenetic Robotics Architecture
confidence: 99%
“…The internal mechanisms and the algorithms of this learning approach as well as the formulae for calculating the gaze shift (i.e., contribution of eyes and head to perform successful fixations) are found in [24]. Notice that in the eye and head motor maps, the number of new links decreases over time as existing ones are used, giving a level of body-related saturation in the mapping module [25].…”
Section: B. Maps, Fields and Links
confidence: 99%
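The statement above refers to formulae for splitting a gaze shift between eye and head contributions, which it attributes to [24]. Those formulae are not reproduced here; the sketch below shows only a generic, placeholder decomposition in which the eye takes as much of the shift as a fixed oculomotor range allows and the head covers the remainder.

```python
def split_gaze_shift(target_deg, eye_range=30.0):
    """Illustrative split of a horizontal gaze shift between eye and head.
    The eye saturates at its oculomotor range; the head covers the rest.
    The 30-degree range and the simple saturation rule are placeholder
    assumptions, not the formulae from [24]."""
    eye = max(-eye_range, min(eye_range, target_deg))  # clamp to eye range
    head = target_deg - eye                            # head makes up the rest
    return eye, head
```

For example, a 20-degree target needs no head movement under this rule, while a 50-degree target saturates the eye at 30 degrees and leaves 20 degrees for the head.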
“…The two learning mechanisms are designed to work in parallel, so that the development of the one directly affects the development of the other. As previously shown in [24], [25], [28], motor babbling has been effectively used to drive the discovery of new experiences between a robotic platform's sensors and motors. This mechanism has been previously evaluated and shown to be a competent approach to robotic sensorimotor learning.…”
Section: E. Modelling Regional Perception and Recognition
confidence: 99%
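The motor babbling the statement above refers to can be sketched generically as a loop that issues random motor commands and records the sensory outcome of each, building up sensor-motor experience. The `execute` and `observe` callbacks below are hypothetical stand-ins for a robot platform's motor and sensor interfaces, not the paper's API.

```python
import random

def motor_babble(execute, observe, trials=100, seed=0):
    """Generic motor-babbling loop: issue random motor commands and pair
    each observed sensory outcome with the command that produced it.
    `execute` and `observe` are hypothetical platform callbacks."""
    rng = random.Random(seed)
    links = []
    for _ in range(trials):
        cmd = (rng.uniform(-1, 1), rng.uniform(-1, 1))  # random pan/tilt
        execute(cmd)                    # act on the world
        links.append((observe(), cmd))  # record outcome -> command pair
    return links
```

The recorded (sensor, motor) pairs are exactly the raw material a mapping module would consume, which is why babbling can drive map learning without any prior model of the body.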