Social interaction is one of the skills social robots need to integrate into human society. However, current social robots interact mainly through audio and visual channels, with little reliance on touch. Several obstacles still stand in the way of haptic interaction for social robots: 1) the complex manufacturing process of tactile sensor arrays is the main barrier to lowering production cost; 2) haptic interaction modes are complex and diverse, and no public interaction standards or datasets for tactile interactive behavior exist for social robots. In view of this, our research addresses the following aspects of a tactile perception system: 1) development of a low-cost tactile sensor array, covering the sensing principle, simulation, manufacture, front-end electronics, and testing, which is then applied across the social robot's whole body; 2) establishment of a tactile interaction model and an event-triggered perception model for social interactive applications, together with the design of preprocessing and classification algorithms. We use k-nearest neighbors, decision trees, support vector machines, and other classifiers to sort touch behaviors into six classes. In particular, the cosine k-nearest neighbors and quadratic support vector machine classifiers achieve an overall mean accuracy above 68%, with individual accuracies above 80%. In short, our research offers a new direction toward low-cost intelligent touch interaction for social robots in real environments. The low-cost tactile sensor array solution and interaction models are expected to be applied to social robots on a large scale.
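The abstract's evaluation pairs a cosine-distance k-NN with a quadratic-kernel SVM for six-way touch classification. The sketch below illustrates that setup with scikit-learn; the synthetic feature matrix, class count per split, and all parameter values are assumptions for illustration, not the authors' actual preprocessing or data.

```python
# Hypothetical sketch: six-class touch-gesture classification with the two
# classifiers named in the abstract (cosine k-NN, quadratic SVM).
# The tactile features here are synthetic stand-ins, not the paper's data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_classes, n_per_class, n_features = 6, 60, 32  # e.g. flattened sensor-array features

# One random mean direction per gesture class, plus per-sample noise
means = 3.0 * rng.normal(size=(n_classes, n_features))
X = np.vstack([rng.normal(loc=means[c], scale=1.0, size=(n_per_class, n_features))
               for c in range(n_classes)])
y = np.repeat(np.arange(n_classes), n_per_class)
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

# Cosine k-NN: neighbors ranked by cosine distance between feature vectors
knn = KNeighborsClassifier(n_neighbors=5, metric="cosine").fit(X_tr, y_tr)
# "Quadratic SVM": an SVC with a degree-2 polynomial kernel
svm = SVC(kernel="poly", degree=2).fit(X_tr, y_tr)

for name, clf in [("cosine k-NN", knn), ("quadratic SVM", svm)]:
    print(f"{name}: accuracy = {accuracy_score(y_te, clf.predict(X_te)):.2f}")
```

On real tactile data, the reported ~68% mean accuracy suggests far more class overlap than this toy example produces; the value of the sketch is only to show how the two named classifiers are parameterized.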
Computational models of emotions can not only improve the effectiveness and efficiency of human-robot interaction but also help a robot adapt better to its environment. When designing computational models of emotions for socially interactive robots, especially robots for people with special needs such as autistic children, one should take into account the social and communicative characteristics of these groups. This article presents a novel computational model of emotions, AppraisalCloudPCT, suited to socially interactive robots used in autistic rehabilitation; to the best of our knowledge, it is the first computational model of emotions built for robots that addresses the needs of a special group such as autistic children. To begin with, we revisit some fundamental and notable computational models of emotions (e.g., OCC, Scherer's appraisal theory, PAD) that have profoundly influenced significant models (e.g., PRESENCE, iGrace, xEmotion) for socially interactive robots. Then, we conduct a comparative assessment between AppraisalCloudPCT and five other significant models for socially interactive robots. Our proposed model is built to meet all six comparison criteria by adopting appraisal theories of emotion, perceptual control theory of emotion, a component-model view of appraisal models, and cloud robotics. We also elaborate on how to implement the model in a socially interactive robot we developed for autistic rehabilitation. Future studies should examine how the model performs in different robots and in more interaction scenarios.