• Emotional expression of robots: in complex interaction scenarios, such as assistive, educational, and social robotics (Fong et al., 2003; Rossi et al., 2020), the ability of robots to exhibit recognizable emotional expressions strongly affects the resulting social interaction (Mavridis, 2015). Several studies have explored which modalities (e.g., facial expression, body posture, movement, voice) can convey emotional information from robots to humans, and how people perceive and recognize robots' emotional states (Tsiourti et al., 2017; Marmpena et al., 2018; Rossi and Ruocco, 2019).
• Ability of robots to infer the human emotional state: robots that can infer and interpret human emotions would interact with people more effectively. Recent work aims to design algorithms for classifying emotional states from different input modalities, such as facial expressions, body language, voice, and physiological signals (McColl et al., 2016; Cavallo et al., 2018).