2007 IEEE/ICME International Conference on Complex Medical Engineering
DOI: 10.1109/iccme.2007.4381776
An experiment study of gesture-based human-robot interface

Cited by 11 publications (3 citation statements). References 10 publications.
“…To control for the effect of different time periods across sessions, the three sessions of every participant were concatenated and then quantized into ten time slots, and the analysis was done according to those time slots rather than the actual time. The figure shows a tendency to reduce the number of unknown gestures used with time in all cases, which indicates that the subjects were able to limit their gestures to the kinds understood by the robot (this corresponds to the replace behavior in [9]). Detailed analysis of the figure shows that this adaptation behavior is Fig.…”
Section: Results (mentioning)
Confidence: 92%
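
The time-slot normalization described in this statement can be illustrated with a minimal sketch. The code below is an assumed reconstruction only: the function name unknown_gestures_per_slot, the per-event timestamps, and the is_unknown predicate are hypothetical and are not taken from the cited paper. It concatenates a participant's three sessions onto one timeline and bins the count of gestures the robot could not interpret into ten equal time slots.

def unknown_gestures_per_slot(sessions, is_unknown, n_slots=10):
    """sessions: list of sessions; each session is a list of (t_sec, gesture)
    pairs ordered by t_sec, measured from that session's start.
    Returns unknown-gesture counts per slot of the concatenated timeline."""
    # Concatenate the sessions on one timeline by offsetting each session
    # by the accumulated duration of the sessions before it.
    timeline, offset = [], 0.0
    for session in sessions:
        timeline.extend((offset + t, g) for t, g in session)
        offset += session[-1][0] if session else 0.0
    total = timeline[-1][0] or 1.0
    counts = [0] * n_slots
    for t, gesture in timeline:
        # Map the event's global time to one of the n_slots equal bins.
        slot = min(int(t / total * n_slots), n_slots - 1)
        if is_unknown(gesture):
            counts[slot] += 1
    return counts

# Toy usage with three short sessions; "??" marks an unrecognized gesture.
sessions = [[(0.0, "stop"), (4.0, "??")], [(1.0, "??"), (6.0, "go")],
            [(2.0, "go"), (5.0, "right")]]
print(unknown_gestures_per_slot(sessions, lambda g: g == "??"))

Counting per slot rather than per wall-clock second makes sessions of different lengths comparable, which appears to be the purpose of the quantization step quoted above.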
“…[8] studied the adaptation of a robot manipulator to fuzzy verbal commands using Probabilistic Neural Networks and reported successful learning with a PA-10 redundant manipulator. Adaptation of the human to the robot's capabilities is less studied. For example, [9] compared using predefined gestures and a joystick to control a miniature robot in the same environment used in the experiment reported in this paper and found that the gesture-based interface was more efficient and reduced the average task completion time compared with the joystick interface. Their experiment also showed a form of human adaptation to the robot's capabilities.…”
Section: Introduction (mentioning)
Confidence: 92%
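
The Probabilistic Neural Network mentioned for [8] is, in essence, a Parzen-window classifier. A minimal sketch under assumed data is given below; the feature values, the smoothing parameter sigma, and the function pnn_predict are illustrative and do not reproduce the setup reported in [8].

import numpy as np

def pnn_predict(X_train, y_train, X_test, sigma=0.5):
    """Assign each test point to the class whose training samples give the
    largest summed Gaussian kernel response (the PNN decision rule)."""
    classes = np.unique(y_train)
    preds = []
    for x in X_test:
        # Summed radial-basis activations of the pattern layer, one per class.
        scores = [
            np.sum(np.exp(-np.sum((X_train[y_train == c] - x) ** 2, axis=1)
                          / (2.0 * sigma ** 2)))
            for c in classes
        ]
        preds.append(classes[int(np.argmax(scores))])
    return np.array(preds)

# Toy usage: two command classes in a two-dimensional feature space.
X = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y = np.array([0, 0, 1, 1])
print(pnn_predict(X, y, np.array([[0.05, 0.1], [1.0, 0.9]])))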
“…For example, the authors in [9] use a finite state machine to develop a speech interaction system. Conversely, the authors in [10] opted for a gesture-based human-robot interface. In this second case, however, the limited set of gestures (only five) limits the usability of the language.…”
Section: State of the Art (mentioning)
Confidence: 99%
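
The finite-state-machine approach attributed to [9] can be sketched very compactly. The states and spoken commands below are hypothetical examples, not the dialogue design of that paper; the point is only that each recognized utterance triggers a transition in a fixed state table.

# Hypothetical dialogue states and commands; unknown utterances are ignored.
TRANSITIONS = {
    ("idle", "wake word"): "listening",
    ("listening", "move command"): "confirming",
    ("confirming", "yes"): "executing",
    ("confirming", "no"): "listening",
    ("executing", "done"): "idle",
}

def step(state, utterance):
    # Stay in the current state when the utterance has no defined transition.
    return TRANSITIONS.get((state, utterance), state)

state = "idle"
for heard in ["wake word", "move command", "yes", "done"]:
    state = step(state, heard)
    print(heard, "->", state)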