Abstract. In recent years, automotive active safety systems have become increasingly common in road vehicles, since they provide an opportunity to significantly reduce traffic fatalities through active vehicle control. Augmented Reality (AR) applications can enhance intelligent transportation systems by superimposing surrounding traffic information on the user's view while keeping drivers' and pedestrians' attention on the road. However, because of complex environmental factors such as weather conditions, illumination changes, and geometric distortions, Traffic Sign Recognition (TSR) has always been considered a challenging task. The aim of this paper is to evaluate the effectiveness of AR cues in improving driving safety by deploying an on-board, camera-based driver alert system for approaching traffic signs such as stop, speed-limit, unique, and danger signs. A new approach is presented for a marker-less AR-TSR system that superimposes augmented virtual objects onto a real scene under all types of driving situations, including unfavorable weather conditions. Our method is composed of an offline and an online stage. In the offline stage, the intrinsic camera parameters, which change depending on the zoom value, are calibrated, and a Haar detector is trained using Haar-like features with AdaBoost. In the online stage, the extrinsic camera parameters are estimated with a homography-based method. With the complete set of camera parameters, virtual objects can be coherently inserted into the video sequence captured by the camera, so that synthetic traffic signs may be added to increase safety.
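As a rough illustration of the pipeline summarized above, the sketch below combines Haar-cascade detection with homography-based extrinsic estimation. It assumes OpenCV, a cascade file ("sign_cascade.xml") produced by an offline AdaBoost training stage, and placeholder values for the intrinsic matrix and physical sign size; none of these names or numbers come from the paper itself.

```python
# Minimal sketch, assuming OpenCV and a pre-trained Haar cascade from the
# offline stage; file name, sign size, and camera matrix are illustrative only.
import cv2
import numpy as np

# Intrinsic matrix K from the offline calibration step (placeholder values).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

# Haar cascade trained offline with Haar-like features and AdaBoost.
cascade = cv2.CascadeClassifier("sign_cascade.xml")

# Metric corners of a canonical planar sign (hypothetical 0.6 m square, Z = 0 plane).
SIGN_SIZE = 0.6
object_corners = np.array([[0, 0], [SIGN_SIZE, 0],
                           [SIGN_SIZE, SIGN_SIZE], [0, SIGN_SIZE]],
                          dtype=np.float32)

def estimate_sign_poses(frame):
    """Detect signs with the Haar cascade, then recover extrinsics via homography."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    detections = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    poses = []
    for (x, y, w, h) in detections:
        # Use bounding-box corners as image points (a real system would refine them).
        image_corners = np.array([[x, y], [x + w, y],
                                  [x + w, y + h], [x, y + h]], dtype=np.float32)
        H, _ = cv2.findHomography(object_corners, image_corners)
        if H is None:
            continue
        # For a planar target, H ~ K [r1 r2 t]; decompose to get rotation and translation.
        M = np.linalg.inv(K) @ H
        scale = 1.0 / np.linalg.norm(M[:, 0])
        r1, r2, t = scale * M[:, 0], scale * M[:, 1], scale * M[:, 2]
        R = np.column_stack((r1, r2, np.cross(r1, r2)))
        poses.append((R, t, (x, y, w, h)))
    return poses
```

Given the recovered (R, t) for each detection, a virtual traffic-sign overlay can be projected back into the frame with the same intrinsics, which is the basic mechanism by which synthetic signs are inserted into the video sequence.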