Accurately estimating latent heat flux (LE) is crucial for efficient irrigation. LE is a fundamental component in determining actual evapotranspiration (ETa), which in turn quantifies the amount of water lost that must be compensated through irrigation. Empirical and physics-based models require extensive input data and suffer from site-specific limitations when estimating LE. In contrast, the emergence of data-driven techniques combined with remote sensing has shown promising results for LE estimation with minimal, easy-to-obtain input data. This paper evaluates two machine learning-based approaches for estimating LE. The first uses climate data, the Normalized Difference Vegetation Index (NDVI), and Land Surface Temperature (LST); the second uses climate data combined with raw satellite bands. In-situ data were sourced from a flux station installed in our study area and include air temperature (Ta), global solar radiation (Rg), and measured LE for the period 2015-2018. The study uses Landsat 8 as the remote sensing data source. First, the 12 available raw bands were downloaded. The LST was then derived from the thermal bands using the Split Window (SW) algorithm, and the NDVI from the optical bands. During machine learning modeling, the CatBoost model was trained and evaluated using the two data-combination approaches. Three-fold cross-validation gave an average RMSE of 27.54 W·m⁻² with the first approach and 27.05 W·m⁻² with the second. These results raise the question: do we need additional computational layers when working with remote sensing products combined with machine learning? Future work will generalize the approach and test it for other applications such as soil moisture retrieval and yield prediction.
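
To make the evaluation protocol concrete, the sketch below illustrates the two feature combinations and the 3-fold cross-validated RMSE described above. It is a minimal illustration under stated assumptions, not the paper's actual code: the file name flux_station_landsat.csv, the column names (Ta, Rg, B1-B11, LST, LE), and the default CatBoost hyperparameters are all assumptions, and the LST column is assumed to have been precomputed with the Split Window algorithm.

```python
# Minimal sketch of the two feature-combination approaches (column names
# and file path are assumptions, not the paper's actual data layout).
import numpy as np
import pandas as pd
from catboost import CatBoostRegressor
from sklearn.model_selection import KFold
from sklearn.metrics import mean_squared_error

# Hypothetical table: one row per observation, with in-situ Ta, Rg,
# measured LE, Landsat 8 band reflectances, and a precomputed LST.
df = pd.read_csv("flux_station_landsat.csv")  # assumed file name

# NDVI from the Landsat 8 optical bands (B5 = NIR, B4 = red).
df["NDVI"] = (df["B5"] - df["B4"]) / (df["B5"] + df["B4"])

# Approach 1: climate data + NDVI + LST (LST assumed derived via SW).
X1 = df[["Ta", "Rg", "NDVI", "LST"]]
# Approach 2: climate data + raw bands (illustrative band columns).
X2 = df[["Ta", "Rg"] + [f"B{i}" for i in range(1, 12)]]
y = df["LE"]

def cv_rmse(X, y, n_splits=3):
    """Average RMSE over k-fold cross-validation, mirroring the
    3-fold protocol described in the abstract."""
    rmses = []
    kf = KFold(n_splits=n_splits, shuffle=True, random_state=0)
    for train_idx, test_idx in kf.split(X):
        model = CatBoostRegressor(verbose=0, random_seed=0)
        model.fit(X.iloc[train_idx], y.iloc[train_idx])
        pred = model.predict(X.iloc[test_idx])
        rmses.append(mean_squared_error(y.iloc[test_idx], pred) ** 0.5)
    return float(np.mean(rmses))

print(f"Approach 1 (NDVI + LST) RMSE: {cv_rmse(X1, y):.2f} W/m^2")
print(f"Approach 2 (raw bands)  RMSE: {cv_rmse(X2, y):.2f} W/m^2")
```

Comparing the two RMSE values printed at the end is what motivates the question posed above: if raw bands fed directly to the model perform on par with derived products, the NDVI and SW-based LST computation layers may be unnecessary overhead.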