RGB-Infrared person re-IDentification (re-ID) aims to match RGB and infrared (IR) images of the same person. However, the modality discrepancy between RGB and IR images poses a significant challenge for re-ID. To address this issue, this paper proposes a Proxy-based Embedding Alignment (PEA) method to align the RGB and IR modalities in the embedding space. PEA introduces modality-specific identity proxies and leverages sample-to-proxy relations to train the model. Specifically, PEA focuses on three types of alignment: intra-modality alignment, inter-modality alignment, and cycle alignment. Intra-modality alignment aims to align sample features and proxies of the same identity within a modality. Inter-modality alignment aims to align sample features and proxies of the same identity across different modalities. Cycle alignment requires that a proxy be aligned with itself after being traced along a cross-modality cycle (e.g., IR→RGB→IR). By integrating these alignments into the training process, PEA effectively mitigates the impact of modality discrepancy and learns discriminative features across modalities. We conduct extensive experiments on several RGB-IR re-ID datasets, and the results show that PEA outperforms current state-of-the-art methods. Notably, on the SYSU-MM01 dataset, PEA achieves 71.0% mAP under the multi-shot setting of the indoor-search protocol, surpassing the best-performing method by 7.2%.
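For concreteness, the PyTorch-style sketch below illustrates one way the three alignments could be instantiated; it is not the exact PEA formulation. It assumes learnable per-modality identity proxies, proxy-softmax losses for the intra- and inter-modality terms, and a soft nearest-proxy round trip for the cycle term; the class name, temperature, and loss choices are all illustrative.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class PEAProxyLoss(nn.Module):
    """Sketch of proxy-based alignment with per-modality identity proxies."""

    def __init__(self, num_ids, feat_dim, temperature=0.1):
        super().__init__()
        # One learnable proxy per identity, per modality (index 0: RGB, 1: IR).
        self.proxies = nn.Parameter(torch.randn(2, num_ids, feat_dim))
        self.t = temperature

    def _proxy_softmax(self, feats, proxies, labels):
        # Cross-entropy over cosine similarities to all proxies of one modality,
        # pulling each sample toward its same-identity proxy.
        logits = F.normalize(feats, dim=1) @ F.normalize(proxies, dim=1).t() / self.t
        return F.cross_entropy(logits, labels)

    def _cycle(self, src, dst):
        # Soft assignment src -> dst -> src; the row-stochastic round-trip matrix
        # should approximate the identity, i.e., each proxy returns to itself.
        a = F.softmax(F.normalize(src, dim=1) @ F.normalize(dst, dim=1).t() / self.t, dim=1)
        b = F.softmax(F.normalize(dst, dim=1) @ F.normalize(src, dim=1).t() / self.t, dim=1)
        round_trip = (a @ b).clamp_min(1e-8)
        target = torch.arange(src.size(0), device=src.device)
        return F.nll_loss(round_trip.log(), target)

    def forward(self, feats, labels, modality):
        # feats: (B, feat_dim); labels: (B,); modality: (B,) with 0=RGB, 1=IR.
        is_ir = modality.bool()
        loss = feats.new_zeros(())
        for m, mask in ((0, ~is_ir), (1, is_ir)):
            if mask.any():
                f, y = feats[mask], labels[mask]
                # Intra-modality: align samples with proxies of their own modality.
                loss = loss + self._proxy_softmax(f, self.proxies[m], y)
                # Inter-modality: align samples with proxies of the other modality.
                loss = loss + self._proxy_softmax(f, self.proxies[1 - m], y)
        # Cycle alignment in both directions (IR->RGB->IR and RGB->IR->RGB).
        loss = loss + self._cycle(self.proxies[1], self.proxies[0])
        loss = loss + self._cycle(self.proxies[0], self.proxies[1])
        return loss

# Hypothetical usage; 395 matches the SYSU-MM01 training identity count.
loss_fn = PEAProxyLoss(num_ids=395, feat_dim=2048)
feats = torch.randn(8, 2048)
labels = torch.randint(0, 395, (8,))
modality = torch.randint(0, 2, (8,))
loss = loss_fn(feats, labels, modality)
```

Under these assumptions, the intra- and inter-modality terms shape sample-to-proxy relations directly, while the cycle term operates only on the proxies, regularizing the two proxy sets so that cross-modality nearest-proxy assignments are mutually consistent.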