Lithium-ion (Li-ion) battery packs have become the most popular option for powering electric vehicles (EVs). However, they have certain drawbacks: the chemical reactions that occur during charging and discharging generate heat, leading to high temperatures and potential safety concerns. These conditions can trigger thermal runaway and rapid deterioration, so efficient thermal management systems are essential to extend battery life span and improve overall performance. An electrochemical-thermal (ECT) model for Li-ion batteries and a conjugate heat transfer model for three-dimensional (3D) fluid flow and heat transfer are developed using COMSOL Multiphysics®. These models are embedded in a novel computational fluid dynamics (CFD)-enabled multi-objective optimization approach to explore the effect of the mini-channel cold plate's geometrical parameters on key performance metrics: battery maximum temperature (Tmax), pressure drop (∆P), and temperature standard deviation (Tσ). The performance of two machine learning (ML) surrogate methods, radial basis functions (RBFs) and Gaussian process (GP), is compared, and the results indicate that the GP approach is the more effective of the two. Global minima of Tmax, Tσ, and ∆P are first identified using single-objective optimization. The third version of the generalized differential evolution (GDE3) algorithm is then used together with the GP surrogate models to perform multi-objective design optimization (MODO), and Pareto fronts are generated to demonstrate the trade-offs between Tmax, Tσ, and ∆P. The optimization results show that the maximum temperature drops from 36.38 to 35.98 °C, the pressure drop decreases markedly from 782.82 to 487.16 Pa, and the temperature standard deviation decreases from 2.14 to 2.12 K; the corresponding optimum design parameters are a channel width of 8 mm and a horizontal spacing near the cold plate margin of 5 mm.
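
The following is a minimal sketch, not the authors' code, of the surrogate-assisted optimization workflow summarized above: Gaussian process surrogates are fitted to a set of CFD samples and then queried by the GDE3 algorithm to trade off Tmax, Tσ, and ∆P. The design variables (channel width and horizontal spacing), their bounds, the training samples, and all numerical values are illustrative assumptions; the study's actual data come from the COMSOL Multiphysics® models.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel
from platypus import GDE3, Problem, Real

# Hypothetical CFD samples: [channel width (mm), horizontal spacing (mm)].
X = np.array([[4, 2], [4, 5], [6, 3], [6, 5],
              [8, 2], [8, 5], [10, 3], [10, 4]], dtype=float)
# Corresponding responses: Tmax (degC), Tsigma (K), dP (Pa) -- placeholder values only.
Y = np.array([
    [37.1, 2.30, 900.0], [36.9, 2.25, 780.0], [36.7, 2.20, 700.0], [36.5, 2.18, 620.0],
    [36.4, 2.16, 560.0], [36.0, 2.12, 490.0], [36.2, 2.15, 520.0], [36.1, 2.14, 505.0],
])

# Fit one GP surrogate per objective (Tmax, Tsigma, dP).
kernel = ConstantKernel(1.0) * RBF(length_scale=[2.0, 2.0])
surrogates = [
    GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, Y[:, j])
    for j in range(Y.shape[1])
]

def evaluate(variables):
    """Return surrogate predictions [Tmax, Tsigma, dP] for a candidate design."""
    x = np.asarray(variables, dtype=float).reshape(1, -1)
    return [float(gp.predict(x)[0]) for gp in surrogates]

# Three-objective problem over the two geometric design variables (assumed bounds).
problem = Problem(2, 3)
problem.types[:] = [Real(4.0, 10.0), Real(2.0, 5.0)]
problem.function = evaluate

# Generalized differential evolution (GDE3) searches for the Pareto front.
algorithm = GDE3(problem, population_size=40)
algorithm.run(2000)

for sol in algorithm.result[:5]:
    w, s = sol.variables
    tmax, tsig, dp = sol.objectives
    print(f"width={w:.2f} mm, spacing={s:.2f} mm -> "
          f"Tmax={tmax:.2f} degC, Tsigma={tsig:.2f} K, dP={dp:.1f} Pa")
```

Because every candidate design is evaluated on the cheap GP surrogates rather than a full CFD run, the evolutionary search can afford thousands of evaluations; the non-dominated solutions it returns approximate the Pareto front between Tmax, Tσ, and ∆P described in the abstract.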