Previous research has highlighted age-related differences in social perception, particularly in the processing of emotional expressions. To date, such studies have largely relied on static emotional stimuli that participants identify passively, with no possibility of interaction. In this study, we propose an interactive virtual environment to better address age-related variations in social and emotional perception. Twenty-two young (18–30 years) and 20 older (60–80 years) adults engaged in a face-to-face conversation with an embodied conversational agent. Participants were invited to interact naturally with the agent and to identify his facial expression. Their gaze behaviour was recorded with an eye-tracking device throughout the interaction. We also explored whether the Big Five personality traits (particularly extraversion) and anxiety modulated gaze during the social interaction. Findings suggested that age-related differences in gaze behaviour were apparent only when decoding social signals (i.e., listening to a partner’s question, identifying facial expressions) and not when communicating social information (i.e., when speaking). Furthermore, higher extraversion levels consistently led to less time spent gazing at the eyes, whereas higher anxiety levels led to slight modulations of gaze only when participants were listening to questions. Face-to-face conversation with virtual agents can provide a more naturalistic framework for assessing online socio-emotional interaction in older adults, which is not easily observable in classical offline paradigms. This study provides novel and important insights into the specific circumstances in which older adults may experience difficulties in social interactions.