IEEE/RSJ International Conference on Intelligent Robots and Systems
DOI: 10.1109/irds.2002.1043938
Vision-based urban navigation procedures for verbally instructed robots

Abstract: The work presented in this thesis is part of a project in instruction-based learning (IBL) for mobile robots, where a robot is designed that can be instructed by its users through unconstrained natural language. The robot uses vision guidance to follow route instructions in a miniature town model. The aim of the work presented here was to determine the functional vocabulary of the robot in the form of "primitive procedures". In contrast to previous work in the field of instructable robots this was done following…

Cited by 21 publications (26 citation statements)
References 27 publications
“…Based on an analysis of the resulting speech corpora, they identified a set of verbal action chunks that could map onto robot control primitives. More recently, they demonstrated the effectiveness of navigation instructions translated into these primitive procedures for actual robot navigation (Kyriacou et al 2005). This research indicates the importance of implementing the mapping between language and behavioural primitives for high-level natural language instruction or programming.…”
Section: Spoken Language Programming
Confidence: 89%
“…In this context, two domains of interaction that humans exploit with great fidelity are spoken language, and the visual ability to observe and understand intentional action. A good deal of research effort has been dedicated to the specification and implementation of spoken language systems for human-robot interaction (Crangle & Suppes 1994, Lauria et al 2002, Severinson-Eklund 2003, Kyriacou et al 2005, Mavridis & Roy 2006). The research described in the current chapter extends these approaches with a Spoken Language Programming system that allows a more detailed specification of conditional execution, and by using language as a complement to vision-based action perception as a mechanism for indicating how things are to be done, in the context of cooperative, turn-taking behavior.…”
Section: Discussion
Confidence: 99%
“…Many research efforts focus on using spatial language to control the robot's position and behavior, or to enable it to answer questions about what it senses. In Lauria et al (2001), Kyriacou et al (2002), and Bugmann et al (2004), Instruction-Based Learning (IBL) is used to train mobile robots through natural language instructions describing a navigation task. In this project, a robot is instructed on how to travel from one place to another in a miniature town.…”
Section: Related Work
Confidence: 99%
“…Kyriacou et al. describe in [6] a verbally instructed robot that executes a given route description in an artificial miniature town. Guided by corpus analysis, they transform a given route description into a list of action chunks accompanied by spatial relations and landmarks.…”
Section: Related Work
Confidence: 99%