Precise, reliable, and low-cost vehicular localization across a continuous spatiotemporal domain is an important problem for outdoor ground vehicles. This paper proposes a visual odometry algorithm that combines a robust, fast feature-matching scheme with an effective antiblurring frame selection strategy. Our method follows the standard procedure of finding feature correspondences between consecutive frames and minimizing their reprojection error. Blurred images, caused by sharp turns or fast movement, pose a major challenge for localization, so we mitigate the impact of blur with an antiblurring frame selection algorithm based on image singular value decomposition. Moreover, a statistical filter on feature-space displacement and a circle-matching check are proposed to screen and prune candidate feature matches, removing the outliers caused by mismatching. An evaluation on the KITTI benchmark dataset and on real outdoor data with blur, low texture, and illumination change demonstrates that the proposed ego-motion scheme achieves competitive performance relative to other state-of-the-art visual odometry approaches.
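
To make the SVD-based antiblurring idea concrete, below is a minimal sketch of one common formulation: scoring a frame by how strongly its singular-value energy concentrates in the top few singular values, since blur tends to suppress the smaller singular values. The function names, the choice of k, and the threshold are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np
import cv2

def svd_blur_score(gray: np.ndarray, k: int = 5) -> float:
    """Blur score in [0, 1]: fraction of singular-value energy carried by
    the top-k singular values. Blurred frames concentrate energy in the
    largest singular values, so a higher score suggests stronger blur."""
    s = np.linalg.svd(gray.astype(np.float64), compute_uv=False)
    return float(s[:k].sum() / s.sum())

def select_sharp_frame(frames, k=5, blur_threshold=0.75):
    """Pick the least-blurred frame from a short window of candidates,
    or return None if every candidate exceeds the blur threshold
    (threshold value is a placeholder, not from the paper)."""
    scored = [(svd_blur_score(cv2.cvtColor(f, cv2.COLOR_BGR2GRAY), k), f)
              for f in frames]
    score, best = min(scored, key=lambda t: t[0])
    return best if score < blur_threshold else None
```

In a frame-selection pipeline of this kind, frames scoring above the threshold would be skipped for pose estimation, and the sharpest recent frame would be matched against the previous keyframe instead.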