Line intensity ratios (LIRs) of helium (He) atoms are known to depend on the electron density, $n_{\rm e}$, and temperature, $T_{\rm e}$, and are therefore widely utilized to evaluate these parameters, a technique known as the He I LIR method. In this conventional method, measured LIRs are compared with theoretical values calculated using a collisional-radiative (CR) model to find the best-matching $n_{\rm e}$ and $T_{\rm e}$. Basic CR models have been improved to account for several effects. For instance, radiation trapping can occur to a significant degree in weakly ionized plasmas, leading to major alterations of LIRs; this effect has been incorporated into CR models through optical escape factors. A new approach to the evaluation of $n_{\rm e}$ and $T_{\rm e}$ from He I LIRs has recently been explored using machine learning (ML). In the ML-aided LIR method, a predictive model is developed with training data, which consist of inputs (measured LIRs) and known outputs ($n_{\rm e}$ or $T_{\rm e}$ measured by other diagnostics). It has been demonstrated that this new method predicts $n_{\rm e}$ and $T_{\rm e}$ more accurately than the conventional method coupled with a CR model, not only for He but also for other species.
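
As a rough illustration of the two workflows described above, the following sketch contrasts a grid comparison against a precomputed CR-model table (the conventional method) with a regression model trained on paired LIR and diagnostic measurements (the ML-aided method). All arrays, grid ranges, the number of ratios, and the choice of a random-forest regressor are placeholders introduced only for this example; they do not correspond to any specific CR model, diagnostic, or the predictive model used in the literature cited here.

\begin{verbatim}
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# --- Conventional He I LIR method (sketch) --------------------------------
# Assume a precomputed CR-model table: for each (n_e, T_e) grid point, the
# model provides a vector of theoretical line intensity ratios.  The "best"
# parameters are those whose theoretical ratios are closest (here in a
# least-squares sense) to the measured ratios.  Grid and table values below
# are random placeholders, not real CR-model output.
ne_grid = np.logspace(17, 20, 40)      # electron density grid [m^-3]
te_grid = np.linspace(1.0, 50.0, 40)   # electron temperature grid [eV]
cr_table = np.random.rand(40, 40, 3)   # placeholder: 3 theoretical LIRs per point

def fit_conventional(measured_lirs):
    """Return (n_e, T_e) of the grid point whose CR-model LIRs best match."""
    resid = np.sum((cr_table - measured_lirs) ** 2, axis=-1)
    i, j = np.unravel_index(np.argmin(resid), resid.shape)
    return ne_grid[i], te_grid[j]

# --- ML-aided LIR method (sketch) ------------------------------------------
# Training data pair measured LIRs (inputs) with n_e and T_e obtained from an
# independent diagnostic (outputs).  The arrays here are random placeholders
# standing in for such measurements.
X_train = np.random.rand(500, 3)                                  # 500 samples x 3 LIRs
y_train = np.column_stack([np.random.uniform(1e17, 1e20, 500),    # n_e [m^-3]
                           np.random.uniform(1.0, 50.0, 500)])    # T_e [eV]

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Predict n_e and T_e directly from a new set of measured ratios.
new_lirs = np.array([[0.4, 0.7, 1.2]])
ne_pred, te_pred = model.predict(new_lirs)[0]
\end{verbatim}

In this sketch the conventional route requires the CR-model table at evaluation time, whereas the trained regressor maps measured ratios to plasma parameters directly, which is the essential distinction drawn in the text above.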