Understanding intentions is a complex social-cognitive task for humans, let alone machines. In this paper we discuss how the developing field of Social Signal Processing, which assesses social cues in order to interpret social signals, may help lay a foundation for robotic social intelligence. We describe a taxonomy intended to further research and development in human-robot interaction (HRI) and to facilitate natural interactions between humans and robots. The taxonomy is based on an interdisciplinary framework developed to integrate: (1) the sensors used for detecting social cues, (2) the parameters for differentiating and classifying differing levels of those cues, and (3) the ways in which sets of social cues indicate specific social signals. Building such a taxonomy is necessarily an iterative process, as technologies improve and social science researchers better understand the complex interactions among the vast number of possible social cue combinations. The goal of this paper is therefore to advance a taxonomy of this kind and to stimulate the interdisciplinary collaboration needed for advanced social intelligence that mutually informs robotic perception and robotic intelligence.
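As a rough illustration of how such a taxonomy might be operationalized in software, the sketch below maps sensors to the social cues they detect, attaches a discretized parameter level to each cue, and resolves sets of observed cues into candidate social signals. All class names, cue labels, and the example cue-to-signal rule are hypothetical and are not drawn from the paper itself.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the three-layer taxonomy described above:
# sensors detect cues, cues carry parameter levels, cue sets indicate signals.

@dataclass
class SocialCue:
    name: str    # e.g. "gaze_aversion"
    sensor: str  # sensor used to detect the cue, e.g. "rgb_camera"
    level: str   # discretized parameter level, e.g. "low" / "medium" / "high"

@dataclass
class Taxonomy:
    # Maps a frozenset of cue names to the social signal they jointly indicate.
    signal_rules: dict = field(default_factory=dict)

    def add_rule(self, cues: set, signal: str) -> None:
        self.signal_rules[frozenset(cues)] = signal

    def interpret(self, observed: list) -> list:
        """Return every signal whose cue set is fully covered by the observations."""
        names = {cue.name for cue in observed}
        return [sig for cue_set, sig in self.signal_rules.items() if cue_set <= names]


# Usage sketch: two cues from different sensors jointly indicating discomfort.
taxonomy = Taxonomy()
taxonomy.add_rule({"gaze_aversion", "increased_interpersonal_distance"}, "discomfort")

observed = [
    SocialCue("gaze_aversion", sensor="rgb_camera", level="high"),
    SocialCue("increased_interpersonal_distance", sensor="depth_camera", level="medium"),
]
print(taxonomy.interpret(observed))  # ['discomfort']
```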
Robots are currently utilized by various civilian and military agencies and are becoming more common in human environments. These machines vary in form and function, but all require an interface that supports naturalistic social interaction. Emotion is a key component of social interaction that conveys states and action tendencies, and a standard design protocol is needed to guide the research and development of emotive display systems so that reliable implementations can be built. This work suggests a framework for conveying emotion based on the analogous physical features of emotive cues and their associations with the dimensions of emotion. Sound, kinesics, and color can be manipulated according to their speed, intensity, regularity, and extent to convey a robot's emotive state. Combinations of cues can enhance human recognition accuracy of robot emotion, but further research is needed to understand the extent of these interactions and to establish each parameter space.
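To make the parameterization concrete, the sketch below shows one way a robot controller might map a point in a two-dimensional emotion space (valence and arousal) onto the speed, intensity, regularity, and extent of its sound, kinesic, and color cues. The specific linear mappings and value ranges are illustrative assumptions, not parameters reported in this work.

```python
# Hypothetical mapping from a valence/arousal emotion state to cue parameters.
# The linear scaling below is an illustrative assumption, not an established model.

def emotion_to_cue_parameters(valence: float, arousal: float) -> dict:
    """Map valence and arousal (each in [-1, 1]) to per-channel cue parameters.

    Each channel (sound, kinesics, color) is described by the same four
    parameters named in the text: speed, intensity, regularity, and extent,
    normalized here to [0, 1].
    """
    # Higher arousal -> faster, more intense, more extensive cues.
    speed = 0.5 + 0.5 * arousal
    intensity = 0.5 + 0.5 * arousal
    extent = 0.5 + 0.5 * abs(arousal)
    # Higher valence -> more regular (less erratic) cues.
    regularity = 0.5 + 0.5 * valence

    params = {"speed": speed, "intensity": intensity,
              "regularity": regularity, "extent": extent}
    # Reuse the same parameter set across channels; a real system would tune each.
    return {"sound": dict(params), "kinesics": dict(params), "color": dict(params)}


# Usage sketch: an excited (positive-valence, high-arousal) state.
print(emotion_to_cue_parameters(valence=0.8, arousal=0.9))
```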
The research and development of assistive robotic manipulators (ARMs) aims to enhance the upper-extremity daily functioning of individuals with disabilities. Resources continue to be invested, yet the field still lacks a standard framework to serve as a tool for the functional assessment and performance evaluation of ARMs. A review of the literature yields several suggestions from research in occupational therapy, rehabilitation robotics, and human-robot interaction. Performance assessments are often used by occupational therapists during rehabilitation interventions to evaluate a client's functional performance. Similar assessments should be developed for ARMs to predict how performance in a clinical setting may generalize to task execution throughout daily living. However, ergonomics and environmental differences have largely been ignored in past research. Additional insights from the literature suggest a common set of coding definitions and a framework for organizing the ad hoc performance measures observed across ARM studies.
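As a sketch of what a common set of coding definitions might look like in practice, the snippet below defines a minimal record for a single ARM task trial and a few derived performance measures. The field names and aggregate measures are illustrative placeholders, not the standardized definitions the field still lacks.

```python
from dataclasses import dataclass
from statistics import mean

# Hypothetical coding scheme for one ARM task trial; the fields are
# illustrative placeholders, not standardized definitions.

@dataclass
class TaskTrial:
    task: str                  # e.g. "pick up cup"
    setting: str               # "clinical" or "home", to capture environmental context
    completed: bool            # binary task success
    completion_time_s: float   # time from command to task completion, in seconds
    operator_corrections: int  # manual corrections issued during the trial


def summarize(trials: list) -> dict:
    """Aggregate trial-level codes into simple performance measures."""
    completed = [t for t in trials if t.completed]
    return {
        "success_rate": len(completed) / len(trials),
        "mean_completion_time_s": mean(t.completion_time_s for t in completed),
        "mean_corrections": mean(t.operator_corrections for t in trials),
    }


# Usage sketch: two clinical trials of the same task.
trials = [
    TaskTrial("pick up cup", "clinical", True, 42.0, 1),
    TaskTrial("pick up cup", "clinical", False, 60.0, 3),
]
print(summarize(trials))
```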