Autonomous vehicles (AVs) will soon become a reality, with almost every major automotive original equipment manufacturer (OEM), such as Tesla, BMW, and Daimler, as well as technology giants such as Google's Waymo and Apple, pushing toward the goal of full self-driving. According to the SAE J3016 taxonomy [1] of autonomous driving, there are six levels of driving automation, ranging from level 0 (completely manual) to level 5 (full self-driving [FSD]), in which the system is expected to function in all geographic locations and under all weather and driving conditions. The proposed benefits of intelligent vehicles include fewer road accidents, enhanced safety, reduced traffic congestion, effective use of commute time, and, importantly, an enjoyable and comfortable ride. With increased autonomy, drivers assume the role of mere passengers who are engaged in nondriving activities and are unavailable to participate in traffic interactions. This increases complexity in mixed-autonomy traffic environments, as interactions with pedestrians and cyclists are traditionally based on visual cues exchanged with the driver. Therefore, intelligent vehicles also need to interact autonomously with other traffic participants such as pedestrians, cyclists, and other vehicles.

Human-vehicle interaction (HVI) is closely related to the field of human-robot interaction (HRI). It entails the problem of understanding and shaping the interaction dynamics between humans and vehicles. In particular, this domain involves the study of sensation, perception, information exchange, inference, and decision-making. [2] With increasing levels of automation, the driver will have more time and choice to perform various tasks other than driving, and this opens new avenues for interaction.
As a result, a growing number of sensors are being incorporated into vehicles to understand the driver's and/or passengers' actions, emotions, and personal preferences in order to offer precise functionalities and services for an enjoyable ride. [3] Figure 1 shows a schematic of in-vehicle interaction, consisting of the various interactive interfaces and sensing modalities present in the vehicle as well as some modes of interaction, such as gestures, speech, and eye gaze.

The advances in HVI have been presented in previous review articles, which have mainly focused on the detection of particular driver characteristics for driving assistance, such as driver attention, [4] emotion recognition, [5] drowsiness detection, [6] mental workload, [7] and so on. A few other articles have reviewed the interior designs of AVs to better support and interact with drivers and passengers. [8,9] Likewise, the user interfaces (UIs) and user experience (UX) of vehicle interiors have been studied in the context of both manual and automated driving. [10] A number of previous works have also studied interaction with the external