2006
DOI: 10.1007/s10514-006-9018-3

First steps toward natural human-like HRI

Abstract: Natural human-like human-robot interaction (NHL-HRI) requires the robot to be skilled both at recognizing and producing many subtle human behaviors, often taken for granted by humans. We suggest a rough division of these requirements for NHL-HRI into three classes of properties: (1) social behaviors, (2) goal-oriented cognition, and (3) robust intelligence, and present the novel DIARC architecture for complex affective robots for human-robot interaction, which aims to meet some of those requirements. We briefly…

Cited by 142 publications (65 citation statements)
References 26 publications
“…While much research in HRI toward this goal has focused on the effects of robot appearance and observable behavior, a significant aspect of natural HRI is communication in natural language (e.g., Scheutz, Schermerhorn, Kramer, & Anderson, 2007), which has only recently received significant attention. Recent research has investigated various social aspects of natural language interactions with robots, such as politeness (e.g., …), turn taking (e.g., Nadel, Revel, Andry, & Gaussier, 2004), affective speech (e.g., Scheutz, Schermerhorn, & Kramer, 2006), dialogue-appropriate facial movements (e.g., Liu, Ishi, Ishiguro, & Hagita, 2012), pragmatic analysis (e.g., Williams, Briggs, Oosterveld, & Scheutz, 2015), and collaborative control (e.g., Fong, Thorpe, & Baur, 2003).…”
Section: Introduction (mentioning)
confidence: 99%
“…Our proposed architecture integrates speech recognition, incremental parsing, incremental semantic analysis, disfluency analysis, and situated reference resolution components into our robotic DIARC architecture [5], which provides mechanisms for integrating incremental natural language processing components with action execution [16], [20]. The present work expands the capabilities of the previous work with the inclusion of a more sophisticated parsing mechanism, allowing greater flexibility and more robust natural language understanding.…”
Section: An Integrated Architecture for Robust Spoken Instruction (mentioning)
confidence: 98%
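
The excerpt above describes an architecture in which incremental NLU components (speech recognition, parsing, semantic analysis, reference resolution) feed into action execution as input arrives. As a rough illustration of that component style only, here is a minimal Python sketch; every name in it (PartialInterpretation, IncrementalPipeline, feed) is hypothetical and does not reflect DIARC's actual interfaces.

```python
# Minimal sketch of an incremental NLU pipeline in the style described above:
# each stage consumes partial input and passes partial results downstream, so
# action execution can react before the utterance is complete. All names are
# hypothetical illustrations, not DIARC's actual API.

from dataclasses import dataclass, field
from typing import Callable, Dict, List


@dataclass
class PartialInterpretation:
    """Partial state threaded through the incremental stages."""
    tokens: List[str] = field(default_factory=list)
    semantics: Dict[str, str] = field(default_factory=dict)
    referents: Dict[str, str] = field(default_factory=dict)


class IncrementalPipeline:
    """Chains incremental stages and forwards each update to action execution."""

    def __init__(self,
                 stages: List[Callable[[PartialInterpretation], PartialInterpretation]],
                 executor: Callable[[PartialInterpretation], None]):
        self.stages = stages      # e.g., parser, semantic analyzer, reference resolver
        self.executor = executor  # acts on the current partial interpretation

    def feed(self, word: str, state: PartialInterpretation) -> PartialInterpretation:
        """Process one recognized word through every stage, then act."""
        state.tokens.append(word)
        for stage in self.stages:
            state = stage(state)
        self.executor(state)      # action execution sees each incremental update
        return state
```

The point of the sketch is only the control flow: each recognized word immediately triggers parsing, semantic analysis, and reference resolution, rather than waiting for the end of the utterance, which is the incremental behavior the surrounding excerpts contrast with conventional sentence-at-a-time NLU.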
“…In contrast, converging evidence from psycholinguistics suggests that human language understanding is incremental and parallel, depends on the speaker's and listener's contexts, utilizes task and goal knowledge, and involves the perceptions and perspectives of situated, embodied agents. While this difference in processing style may not matter for many NLU applications, it is critical in situations where humans and robots interact naturally as embodied agents co-located in the same environment [5].…”
Section: Introduction (mentioning)
confidence: 99%
“…In this section, we describe how each stage of our HRI-oriented clarification request generation framework [26] is handled by components of the DIARC architecture [18].…”
Section: Approach (mentioning)
confidence: 99%