In recent years, contrastive learning has gained widespread adoption in machine learning applications to physical systems, primarily owing to its cross-modal capabilities and scalability. Building on the foundation of Kolmogorov−Arnold Networks (KANs) [Liu, Z. et al. KAN: Kolmogorov−Arnold Networks. arXiv 2024, arXiv:2404.19756], we introduce a novel contrastive learning framework, Kolmogorov−Arnold Contrastive Crystal Property Pretraining (KCCP), which integrates the principles of CLIP and KAN to establish robust correlations between crystal structures and their physical properties. During training, we conducted a comparative analysis of Multilayer Perceptron (MLP) and KAN heads, finding that KAN significantly outperforms MLP in both accuracy and convergence speed on this task. By extending contrastive learning to physical systems, KCCP offers a promising approach for constructing cross-data structural and cross-modal physical models, an area of considerable potential.
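To make the pairing of a CLIP-style objective with a KAN head concrete, the sketch below shows a symmetric contrastive loss between crystal-structure features and property features, with a simplified KAN-style projection layer standing in for the MLP head it would replace. This is a minimal illustration under assumed names and dimensions (SimpleKANLayer, the 128/32/64 feature sizes, the Gaussian radial basis), not the KCCP implementation; the original KAN formulation uses B-spline bases rather than the Gaussian basis used here.

```python
# Minimal sketch (PyTorch), not the KCCP implementation: a CLIP-style symmetric
# contrastive loss between crystal-structure and property embeddings, with a
# simplified KAN-style projection head. Names, dimensions, and the Gaussian
# radial basis are illustrative assumptions (the original KAN uses B-splines).
import torch
import torch.nn as nn
import torch.nn.functional as F


class SimpleKANLayer(nn.Module):
    """KAN-style layer: each input-output edge carries a learnable 1D function,
    here a sum of Gaussian bumps on a fixed grid plus a SiLU-linear residual."""
    def __init__(self, in_dim, out_dim, num_basis=8, grid_range=(-2.0, 2.0)):
        super().__init__()
        self.register_buffer("centers", torch.linspace(*grid_range, num_basis))
        self.coeffs = nn.Parameter(0.1 * torch.randn(out_dim, in_dim, num_basis))
        self.base = nn.Linear(in_dim, out_dim)  # residual branch, as in KAN

    def forward(self, x):                                          # x: (B, in_dim)
        phi = torch.exp(-(x.unsqueeze(-1) - self.centers) ** 2)    # (B, in, K)
        spline = torch.einsum("bik,oik->bo", phi, self.coeffs)     # (B, out)
        return self.base(F.silu(x)) + spline


class ContrastiveCrystalModel(nn.Module):
    """CLIP-style pairing of structure features with property features."""
    def __init__(self, struct_dim=128, prop_dim=32, embed_dim=64):
        super().__init__()
        # Stand-ins for the real encoders (e.g. a graph network for structures);
        # the projection heads are where a KAN is swapped in for an MLP.
        self.struct_proj = SimpleKANLayer(struct_dim, embed_dim)
        self.prop_proj = SimpleKANLayer(prop_dim, embed_dim)
        self.logit_scale = nn.Parameter(torch.tensor(2.659))  # ln(1/0.07), as in CLIP

    def forward(self, struct_feats, prop_feats):
        z_s = F.normalize(self.struct_proj(struct_feats), dim=-1)
        z_p = F.normalize(self.prop_proj(prop_feats), dim=-1)
        logits = self.logit_scale.exp() * z_s @ z_p.t()            # (B, B) similarities
        targets = torch.arange(logits.size(0), device=logits.device)
        # Symmetric InfoNCE: matching structure/property pairs sit on the diagonal.
        return 0.5 * (F.cross_entropy(logits, targets) +
                      F.cross_entropy(logits.t(), targets))


# Toy usage with random stand-in features for a batch of 16 crystals.
model = ContrastiveCrystalModel()
loss = model(torch.randn(16, 128), torch.randn(16, 32))
loss.backward()
```

Swapping `SimpleKANLayer` back to `nn.Linear` (or a small MLP) in the projection heads reproduces the MLP baseline used for the accuracy and convergence comparison described above.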