Assistive robotic systems could be a suitable solution to support a variety of health and care services, promote independent living, and even simulate affection to reduce loneliness. However, their adoption is limited by several technical and practical issues, as well as by user concerns about ethics, data security, and privacy. Beyond the common threats associated with internet connectivity, personal robotic systems offer advanced interaction modalities, such as audio, video, touch, and gestures, which could be exploited to gain access to private data stored in the robot. Therefore, novel, safer methods of interaction should be designed to safeguard users’ privacy. To stimulate further research on secure and private multimodal interaction, this article presents a thorough study of the state-of-the-art literature on data security and user privacy in interactive social robotic systems for health and care. We focus on social robotics to assist older people, a global challenge that is receiving a great deal of attention from the robotics and social care communities. This application will have a significant positive impact on the economy and society, but it also poses various security and privacy issues. The article analyses the key vulnerable areas where data leakage could occur during multimodal interaction with a personal assistive robotic system. A blockchain-based, resource-aware framework, combined with a continuous multifactor authentication mechanism, is envisaged as a potential solution for making such systems secure by design, thereby increasing trust, acceptability, and adoption. Among the key cybersecurity research challenges, it is crucial to devise an intelligent mechanism that autonomously determines the right trade-off between continuous user prompts and system usability, according to data types and personal preferences.