Emotion, and a broader range of affective and cognitive states, play an important role on the road. While these states have predominantly been investigated in terms of driver safety, the advent of autonomous vehicles (AVs) is expected to bring a fundamental shift in focus for in-car emotion recognition, from the driver to the passengers. This work presents a number of affect-enabled applications, including adapting the driving style to the passengers' emotional experience and tailoring the infotainment to personal preferences. It anticipates upcoming challenges and provides suggestions for multimodal affect modelling, with a focus on the audio and visual modalities. In particular, these challenges include context awareness, reliable diarisation of multiple passengers, group affect, and personalisation. Finally, we offer recommendations on future research directions, including explainability, privacy, and holistic modelling.