Optical filters are specialized structures designed to selectively transmit specific regions of the optical spectrum while blocking others. These filters achieve their desired properties using a variety of materials and methods. This work focuses on designing and optimizing multilayer optical filters using Machine Learning (ML) and Deep Learning (DL) techniques. A dataset is created from Finite Difference Time Domain (FDTD) simulations of Germanium (Ge) substrates coated with alumina (Al2O3) or silica (SiO2). The dataset contains reflectance values at wavelengths from 3 µm to 12 µm, covering the medium-wave infrared (MWIR, 3–5 µm) and long-wave infrared (LWIR, 8–12 µm) bands. Six ML algorithms and two DL models, an artificial neural network (ANN) and a convolutional neural network (CNN), are evaluated to determine the most effective approach for predicting reflectance properties. Bayesian optimization is used to fine-tune the hyperparameters of the DL models to achieve their best performance. The results show that ML models, particularly the decision tree, random forest, and bagging methods, outperform the DL models in predicting reflectance values, and they provide a valuable reference for designing and fabricating optical thin-film filters.
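The tree-based regression workflow summarized above can be sketched as follows. This is a minimal illustration only: the synthetic data generator below is a toy stand-in for the FDTD-simulated dataset (layer thicknesses and a query wavelength mapped to a reflectance value), and the feature layout is an assumption, not the paper's actual data format.

```python
import numpy as np
from sklearn.ensemble import BaggingRegressor, RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)

# Toy stand-in for the FDTD dataset: each sample is a coating-stack
# description (two layer thicknesses in nm) plus a query wavelength in um;
# the target is a synthetic "reflectance" with an interference-like shape.
n = 2000
thickness = rng.uniform(50, 500, size=(n, 2))      # two coating layers (nm)
wavelength = rng.uniform(3.0, 12.0, size=(n, 1))   # MWIR-LWIR range (um)
X = np.hstack([thickness, wavelength])
# Smooth illustrative target; the real mapping comes from FDTD simulation.
y = 0.5 + 0.4 * np.sin(2 * np.pi * thickness[:, 0] / (wavelength[:, 0] * 1e3))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# The three ML methods the abstract highlights as the strongest performers.
models = {
    "decision tree": DecisionTreeRegressor(random_state=0),
    "random forest": RandomForestRegressor(n_estimators=100, random_state=0),
    "bagging": BaggingRegressor(random_state=0),
}
scores = {}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    scores[name] = r2_score(y_te, model.predict(X_te))
    print(f"{name}: R^2 = {scores[name]:.3f}")
```

On the real dataset, the same fit/predict pattern would be applied to the simulated reflectance spectra, with model quality compared via held-out regression metrics such as R².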