This paper presents a model of ocular-motor development, inspired by ideas and data from developmental psychology. The learning problem concerns the growth of the transform between image space and motor space necessary for the control of visual saccades. An implementation is used to produce experimental results, which are presented and discussed. The algorithm is simple, extremely fast, self-calibrating, adaptive to change, and exhibits emergent stages of behaviour as learning progresses.
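The abstract describes learning the transform from retinal (image) coordinates to motor commands for saccades. A minimal sketch of one way such a map could be self-calibrated is shown below: retinal space is discretised into cells, each storing a motor command that is nudged by the residual retinal error after every attempted saccade. This is an illustrative simplification, not the paper's algorithm; the grid size, correction rate, and simulated motor-to-retina matrix `A` are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

GRID = 11    # discretisation of retinal space into GRID x GRID cells (illustrative)
LR = 0.5     # correction rate (illustrative)

# Simulated retinal effect of a motor command; unknown to the learner.
A = np.array([[2.0, 0.1],
              [-0.1, 1.5]])

def cell(offset):
    """Map a retinal offset in [-1, 1]^2 to a grid cell index."""
    idx = np.clip(((offset + 1.0) / 2.0 * (GRID - 1)).round().astype(int),
                  0, GRID - 1)
    return tuple(idx)

# Each cell stores the motor command currently believed to foveate targets there.
motor_map = np.zeros((GRID, GRID, 2))

def train(steps=20000):
    for _ in range(steps):
        target = rng.uniform(-1.0, 1.0, size=2)    # retinal offset of a target
        cmd = motor_map[cell(target)]              # look up current motor guess
        residual = target - A @ cmd                # retinal error after the saccade
        motor_map[cell(target)] += LR * residual   # correct the map entry

def mean_error(samples=1000):
    errs = []
    for _ in range(samples):
        target = rng.uniform(-1.0, 1.0, size=2)
        errs.append(np.linalg.norm(target - A @ motor_map[cell(target)]))
    return float(np.mean(errs))

before = mean_error()   # error with an uncalibrated (zero) map
train()
after = mean_error()    # error after self-calibration
```

The per-cell update converges because the residual dynamics contract (the spectral radius of I − LR·A is below 1 for this A); accuracy after learning is limited only by the grid resolution, which is in the spirit of the fast, self-calibrating behaviour the abstract claims.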
Developmental robotics is concerned with the design of algorithms that promote robot adaptation and learning through qualitative growth of behaviour and increasing levels of competence. This paper uses ideas and inspiration from early infant psychology (up to 3 months of age) to examine how robot systems could discover the structure of their local sensory-motor spaces and learn how to coordinate these for the control of action. An experimental learning model is described and results from robotic experiments using the model are presented and discussed.
Q. Meng and M. H. Lee, Automated cross-modal mapping in robotic eye/hand systems using plastic radial basis function networks, Connection Science, 19(1), pp 25-52, 2007.

Advanced autonomous artificial systems will need incremental learning and adaptive abilities similar to those seen in humans. Knowledge from biology, psychology and neuroscience is now inspiring new approaches for systems that have sensory-motor capabilities and operate in complex environments. Eye/hand coordination is an important cross-modal cognitive function, and is also typical of many of the other coordinations that must be involved in the control and operation of embodied intelligent systems. This paper examines a biologically inspired approach for incrementally constructing compact mapping networks for eye/hand coordination. We present a simplified node-decoupled extended Kalman filter for radial basis function networks, and compare this with other learning algorithms. An experimental system consisting of a robot arm and a pan-and-tilt head with a colour camera is used to produce results and test the algorithms in this paper. We also present three approaches for adapting to structural changes during eye/hand coordination tasks, and the robustness of the algorithms under noise is investigated. The learning and adaptation approaches in this paper have similarities with current ideas about neural growth in the brains of humans and animals during tool-use, and infants during early cognitive development.
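The abstract's core object is a radial basis function network mapping between eye and hand spaces. The sketch below shows the basic structure of such a mapping: Gaussian features over gaze angles, with output weights fit to observed hand positions. The paper trains its networks incrementally with a node-decoupled extended Kalman filter; here, for brevity, the weights are fit by ordinary least squares, and the centre grid, width, and simulated ground-truth coordination are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

def rbf_features(X, centres, width):
    """Gaussian RBF activations of inputs X (n x 2) at the given centres."""
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * width ** 2))

# Simulated ground-truth eye-to-hand coordination (unknown in a real system).
def true_map(gaze):
    pan, tilt = gaze[:, 0], gaze[:, 1]
    return np.stack([np.sin(pan) * 40.0, np.sin(tilt) * 30.0], axis=1)

# Training pairs: gaze angles and the hand positions observed at those fixations.
gaze = rng.uniform(-1.0, 1.0, size=(500, 2))
hand = true_map(gaze)

# Fixed 5x5 grid of RBF centres over the gaze workspace (illustrative).
g = np.linspace(-1.0, 1.0, 5)
centres = np.array([[a, b] for a in g for b in g])

Phi = rbf_features(gaze, centres, width=0.4)
W, *_ = np.linalg.lstsq(Phi, hand, rcond=None)   # batch least-squares weights

# Evaluate the learned mapping on held-out gaze directions.
test = rng.uniform(-1.0, 1.0, size=(200, 2))
pred = rbf_features(test, centres, width=0.4) @ W
rmse = float(np.sqrt(((pred - true_map(test)) ** 2).mean()))
```

The plasticity the paper describes (adding, pruning, and moving nodes, and the EKF-based online weight updates) would replace the fixed centre grid and batch solve used here.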