2012 IEEE/RSJ International Conference on Intelligent Robots and Systems
DOI: 10.1109/iros.2012.6386297
The power of prediction: Robots that read intentions

Abstract: Humans are experts in cooperating in a smooth and proactive manner. Action and intention understanding are critical components of efficient joint action. In the context of the EU Integrated Project JAST [16] we have developed an anthropomorphic robot endowed with these cognitive capacities. This project and the respective robot (ARoS) are the focus of the video. More specifically, the results illustrate crucial cognitive capacities for efficient and successful human-robot collaboration such as goal inference, error…

Cited by 10 publications (5 citation statements)
References 21 publications
“…Besides emotion and atmosphere, multi-robot behavior adaptation to intention is also a potential study based on the proposed intention understanding model. In some real applications of HRI, intention understanding has been successfully applied in autonomous robots, e.g., the Pioneer 2DX robot [45] and ARoS [46], from which we infer that the proposed TLFSVR-based intention understanding model would be an effective way for autonomous robots to deeply understand people's inner thoughts in HRI.…”
Section: Results
confidence: 98%
“…As cognitive science develops, in real applications of human-robot interaction, intention understanding has been successfully applied in autonomous robots, e.g., the Pioneer 2DX robot understands people's walking direction from experience using Hidden Markov Models [45], and the ActivMedia Robotics Operating System (ARoS) robot acts as a sociable partner in collaborative joint activity based on cognitive understanding of people's actions [46], showing evidence that the proposed TLFSVR-based intention understanding model may be an effective way for autonomous robots to deeply understand people's inner thoughts in human-robot interaction.…”
Section: Application Experiments in MRS
confidence: 99%
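As a rough illustration of the HMM-based intention inference mentioned in the statement above, a minimal two-state forward pass over observed headings could look like the sketch below. The states, transition probabilities, and emission probabilities are purely illustrative assumptions, not the model used in [45].

```python
# Hypothetical sketch: inferring a person's intended walking direction
# from noisy heading observations with a two-state HMM forward pass.
# All state names and probabilities are illustrative assumptions.

STATES = ["toward_robot", "away_from_robot"]
INIT = {"toward_robot": 0.5, "away_from_robot": 0.5}
TRANS = {  # intentions tend to persist over time
    "toward_robot": {"toward_robot": 0.9, "away_from_robot": 0.1},
    "away_from_robot": {"toward_robot": 0.1, "away_from_robot": 0.9},
}
EMIT = {  # P(observed heading | intention)
    "toward_robot": {"approach": 0.8, "retreat": 0.2},
    "away_from_robot": {"approach": 0.3, "retreat": 0.7},
}

def infer_intention(observations):
    """Forward algorithm: normalized posterior over intentions after all observations."""
    belief = {s: INIT[s] * EMIT[s][observations[0]] for s in STATES}
    for obs in observations[1:]:
        belief = {
            s: EMIT[s][obs] * sum(belief[p] * TRANS[p][s] for p in STATES)
            for s in STATES
        }
    total = sum(belief.values())
    return {s: v / total for s, v in belief.items()}

posterior = infer_intention(["approach", "approach", "approach"])
```

After several consistent "approach" observations, the posterior concentrates on the matching intention, which is the basic mechanism behind reading a walking direction from experience.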
“…It implements a highly context-sensitive mapping of an observed action of the co-worker onto an adequate complementary robot behavior. This mapping takes into account different task-related and user-related factors including an error monitoring capacity [6,8,54]. Neural populations encode in their suprathreshold activity a mismatch between which assembly step the operator should execute (shared task knowledge) and the predicted assembly step inferred from the observed motor behavior.…”
Section: Discussion
confidence: 99%
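The mismatch idea in the statement above — comparing the assembly step the shared task plan expects with the step predicted from the observed action — can be sketched at a symbolic level as follows. The step names and the action-to-behavior mapping are hypothetical placeholders, not the neural-population model of [6,8,54].

```python
# Hypothetical sketch of error monitoring in joint assembly:
# flag a mismatch between the step the operator should execute
# (shared task knowledge) and the step predicted from observation.
# Step names and the mapping are illustrative assumptions.

COMPLEMENT = {  # observed operator action -> complementary robot behavior
    "grasp_base": "hold_base_steady",
    "insert_bolt": "hand_over_wrench",
    "attach_wheel": "hand_over_wheel",
}

def monitor_step(expected_step, predicted_step):
    """Return a robot response, signaling an error when prediction and plan diverge."""
    if predicted_step != expected_step:
        return ("error", f"expected {expected_step}, observed {predicted_step}")
    return ("act", COMPLEMENT[predicted_step])
```

When prediction and plan agree, the robot selects the complementary behavior; when they diverge, the mismatch signal can trigger corrective communication instead of blind execution.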
“…In neuroscience, the ideomotor principle (IMP) [114] emphasises the importance of anticipating (implicitly or explicitly) the sensory consequences of our actions and the actions of others for: 1) adaptive behaviour, 2) guidance of attention, 3) mentalising abilities and 4) social learning, all of which are a powerful means for building artificial cognitive systems that can acquire new knowledge autonomously, that learn from humans or that adapt to particular environments and preferences of the users [53,115].…”
Section: B. Automatic Anticipatory and Predictive Mechanisms for Clos…
confidence: 99%