In this paper, we propose an automatic and efficient method for optical and SAR image registration based on an improved phase congruency (PC) model. First, evenly distributed keypoints are extracted from the optical image via a block-Harris method. Complementary grid points are then selected in image regions with poor structural information and added to the keypoint set. For each keypoint, a robust feature representation that captures the local spatial relationship is constructed from the improved PC model. Specifically, we use two different PC models, the classic PC and the SAR-PC, to build features for the optical and SAR images, respectively. The PC responses of several orientations are aggregated into feature descriptors, and a similarity metric is obtained via the phase correlation of these descriptors. The proposed similarity metric not only finds accurate correspondences but also runs efficiently without presetting the size of the search region. We compare the proposed method with two baselines and state-of-the-art (SOTA) methods, namely OS-SIFT, HOPC, and CFOG, in various scenarios. The results show that the proposed method outperforms the baselines, performs comparably to the SOTA methods in regions with abundant structural information, and performs better in regions with less structural information. Moreover, we build a high-resolution optical and SAR image matching dataset consisting of 10,692 non-overlapping patch pairs of 256 × 256 pixels at 1-m resolution. Results of two benchmarks, a Siamese deep matching network and a conditional Generative Adversarial Network, show that this dataset is practical and challenging.
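The core of the proposed similarity metric is FFT-based phase correlation between the orientation-aggregated PC descriptors of an optical patch and a SAR patch. The following is a minimal NumPy sketch of generic phase correlation under that assumption; the function name, the orientation-channel handling, and the sign convention are illustrative choices, not the authors' implementation.

```python
import numpy as np

def phase_correlation(desc_opt, desc_sar, eps=1e-8):
    """Estimate the shift that aligns desc_sar to desc_opt via phase correlation.

    desc_opt, desc_sar : float arrays of identical shape, either 2-D or
    3-D with orientation channels stacked along the last axis.
    Returns ((dy, dx), peak_value).
    """
    F1 = np.fft.fft2(desc_opt, axes=(0, 1))
    F2 = np.fft.fft2(desc_sar, axes=(0, 1))
    cross = F1 * np.conj(F2)
    cross /= (np.abs(cross) + eps)            # keep only the phase term
    corr = np.real(np.fft.ifft2(cross, axes=(0, 1)))
    if corr.ndim == 3:                         # aggregate orientation channels
        corr = corr.sum(axis=2)
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # wrap peak coordinates into the signed range [-N/2, N/2)
    shift = tuple(p - s if p > s // 2 else p for p, s in zip(peak, corr.shape))
    return shift, corr.max()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    a = rng.random((256, 256))
    b = np.roll(a, shift=(5, -3), axis=(0, 1))  # simulate a known offset
    offset, score = phase_correlation(a, b)
    print(offset, round(float(score), 3))       # (-5, 3): shift aligning b to a
```

Because the peak of the correlation surface is located directly, no search-window size has to be preset, which is the efficiency property the abstract refers to.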