A series of calcium borosilicate glasses with varying [B₂O₃], [MoO₃], and [CaO] were prepared and irradiated with 92 MeV Xe ions to simulate the damage from long-term α-decay in nuclear waste glasses. Modifications to the solubility of molybdenum, the microstructure of the separated phases, and the Si-O-B network topology were investigated following five irradiation experiments that achieved doses between 5 × 10¹² and 1.8 × 10¹⁴ Xe ions/cm², in order to test whether irradiation would induce, propagate, or anneal phase separation. Using electron microscopy, EDS analysis, Raman spectroscopy, and XRD, irradiation was observed to increase the incorporation of MoO₄²⁻ by increasing the structural disorder within and between the heterogeneous amorphous phases. This occurred through Si/B-O-Si/B bond breakage and the reformation of boroxol and three-/four-membered SiO₄ rings. De-mixing of the Si-O-B network concurrently enabled cross-directional Ca and Mo diffusion along defect-created pathways, which were prevalent along the interface between phases. The initiation and extent of these changes depended primarily on the [SiO₂]/[B₂O₃] ratio, with [MoO₃] having a secondary influence on the defect population with increasing dose. Microstructurally, these bonding changes reduced the heterogeneity between amorphous phases by reducing the size and increasing the spatial distribution of immiscible droplets. This general increase in structural disorder prevented crystallization in most cases, and where precipitation was initiated by radiation, the precipitates were re-amorphized with increasing dose. These outcomes suggest that internal radiation can alter phase-separation tie lines and can therefore be used as a tool to design specific structural environments for the long-term encapsulation of radioisotopes.