Abstract-A driver's gaze direction is critical information for understanding driver state. In this paper, we present a distributed camera framework that estimates a driver's coarse gaze direction using both head and eye cues. Coarse gaze direction is often sufficient for many applications; the challenge, however, is to estimate it robustly in naturalistic real-world driving. Towards this end, we propose gaze-surrogate features estimated from the eye region via eyelid and iris analysis, and we present a novel computational framework for iris detection. The proposed features can be extracted robustly and used to determine the driver's gaze zone effectively. We evaluated the proposed system on a dataset collected from naturalistic on-road driving on urban streets and freeways. A human expert annotated ground-truth gaze zones using information from the driver's eyes and the surrounding context. We conducted two experiments to compare the performance of gaze zone estimation with and without eye cues. The head-alone configuration performs reasonably well for most gaze zones, with an overall weighted accuracy of 79.8%. Adding eye cues boosts the overall weighted accuracy to 94.9% and improves the true detection rate for every individual gaze zone, especially between adjacent zones. These evaluations demonstrate the efficacy of the proposed features and show very promising results for robust gaze zone estimation.