The combined effects of aqueous corrosion, applied stress, and seeded cracks on leakage from cast iron pipes have not been examined thoroughly, owing to the complexity of their interactions and the difficulty of predicting them. This study addresses this gap by investigating the interdependencies among corrosion, stress, and cracks in cast iron pipes in order to optimise material selection and design for corrosive environments. Leakage experiments were conducted under simulated localised corrosive conditions and internal pressure; leakage increased from 0 to 25 mL as crack size increased through 0.5 mm, 0.8 mm, 1 mm, and 1.2 mm, corrosion time through 0, 120, 160, and 200 h, and the applied stress level was varied. An empirical model was developed using a curve-fitting approach to map the relationships among corrosion time, crack propagation, and leakage amount. The results, supported by SEM microstructure images and the empirical data, show that the interaction between corrosion, stress, and crack propagation is complex and nonlinear, with the leakage amount increasing from 0.7 to 0.10 mm every 15 min.
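
As an illustration of the curve-fitting approach mentioned above, the sketch below fits a hypothetical power-law form relating corrosion time and initial crack width to leakage volume using nonlinear least squares. The functional form, the variable names, and the data values are assumptions for demonstration only and are not taken from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

# Assumed empirical form (not the study's model): leakage grows as a
# power law in corrosion time t (hours) and initial crack width w (mm).
def leakage_model(X, a, b, c):
    t, w = X
    return a * np.power(t + 1.0, b) * np.power(w, c)

# Illustrative data only, spanning the ranges quoted in the abstract
# (corrosion times 0-200 h, crack widths 0.5-1.2 mm, leakage 0-25 mL).
t_data = np.array([0, 120, 160, 200, 0, 120, 160, 200], dtype=float)
w_data = np.array([0.5, 0.5, 0.8, 1.0, 0.8, 1.0, 1.2, 1.2])
q_data = np.array([0.0, 4.0, 9.0, 14.0, 1.0, 8.0, 18.0, 25.0])  # mL, invented

# Fit the three coefficients by nonlinear least squares.
popt, _ = curve_fit(leakage_model, (t_data, w_data), q_data, p0=(0.1, 1.0, 1.0))
a, b, c = popt
print(f"fitted coefficients: a={a:.3f}, b={b:.3f}, c={c:.3f}")

# Predict leakage for an unseen condition, e.g. 180 h corrosion, 1.0 mm crack.
print("predicted leakage (mL):",
      leakage_model((np.array([180.0]), np.array([1.0])), *popt))
```

A surface fitted this way can be interrogated for any intermediate combination of corrosion time and crack size, which is the practical value of mapping the corrosion-crack-leakage relationship with an empirical model rather than tabulated test points.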