The application of solar energy as a renewable energy source has escalated significantly owing to its abundance and worldwide availability. However, the intermittent behavior of solar irradiance is a serious disadvantage for electricity grids that rely on photovoltaic (PV) systems, so reliable solar irradiance data are vital for consistent energy production. Geostationary satellite images offer a solution to this issue, as they provide a database of solar irradiance on a massive spatiotemporal scale. Estimation of global horizontal irradiance (GHI) from satellite images has been developed using physical and semi-empirical models, but only a few studies have been dedicated to modeling GHI with semi-empirical models in Korea. Therefore, this study conducted a comparative analysis to determine the most suitable semi-empirical GHI model for Korea. Considering their applicability, the Beyer, Rigollier, Hammer, and Perez models were selected to estimate the GHI over Seoul, Korea. After a comparative evaluation, the Hammer model was determined to be the best. This study also introduced a hybrid model and applied a long short-term memory (LSTM) model to improve prediction accuracy. The hybrid model exhibited a smaller root-mean-square error (RMSE) of 97.08 W/m² than the Hammer model's 103.92 W/m², while producing a comparable mean bias error. Meanwhile, the LSTM model showed the potential to reduce the RMSE by a further 11.2% compared with the hybrid model.
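For readers unfamiliar with the error metrics used in this comparison, the sketch below shows one common way to compute the RMSE and mean bias error from paired hourly measured and modeled GHI series; the function names and the sample values are illustrative only and are not taken from the study.

```python
import numpy as np

def rmse(ghi_measured, ghi_modeled):
    """Root-mean-square error between measured and modeled GHI (W/m^2)."""
    diff = np.asarray(ghi_modeled) - np.asarray(ghi_measured)
    return np.sqrt(np.mean(diff ** 2))

def mbe(ghi_measured, ghi_modeled):
    """Mean bias error; positive values indicate overestimation (W/m^2)."""
    diff = np.asarray(ghi_modeled) - np.asarray(ghi_measured)
    return np.mean(diff)

# Hypothetical hourly GHI values (W/m^2), for illustration only.
measured = [420.0, 510.0, 630.0, 580.0, 450.0]
modeled  = [400.0, 530.0, 600.0, 610.0, 470.0]

print(f"RMSE = {rmse(measured, modeled):.2f} W/m^2")
print(f"MBE  = {mbe(measured, modeled):.2f} W/m^2")
```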
Solar irradiance models help mitigate the lack of measurement data at ground stations. Conventionally, such models have relied on physical calculations or empirical correlations. Recently, machine learning, as a sophisticated statistical method, has gained popularity owing to its accuracy and potential. While some studies have compared machine learning models with other model types, no study has yet compared them side by side on the same datasets at different locations. Therefore, this study evaluates the accuracy of three representative models for estimating solar irradiance using measured atmospheric variables and cloud amount derived from satellite images as input parameters. Based on applicability and performance, this study selected the Fast All-sky Radiation Model for Solar applications (FARMS), derived from the radiative transfer approach; the Hammer model, which uses a simplified atmospheric correlation; and the long short-term memory (LSTM) model, which specializes in sequential datasets. Global horizontal irradiance (GHI) was modeled for five distinct locations in South Korea and compared with two years of hourly measurement data to yield error metrics. When identical input parameters were used, the LSTM model outperformed FARMS and the Hammer model in terms of relative root-mean-square difference (rRMSD) and relative mean bias difference (rMBD). Training an LSTM model with the input parameters of FARMS, such as ozone, nitrogen, and precipitable water, yielded more accurate results than using the Hammer model. The result shows an unbiased and accurate estimation, with an rRMSD of 23.72% and an rMBD of 0.14%. Conversely, FARMS offers faster processing and does not require a large amount of data to produce a fair estimation.
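As a rough illustration of the LSTM approach described above, the sketch below builds a small sequence model that maps windows of atmospheric inputs (e.g., cloud amount, ozone, precipitable water) to an hourly GHI estimate. The window length, feature set, layer sizes, and training data are all assumptions made for demonstration; the abstract does not specify the study's exact architecture.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

# Assumed shapes: 24-hour input windows of 5 atmospheric features
# (e.g., cloud amount, ozone, precipitable water, temperature, humidity).
timesteps, n_features = 24, 5

model = models.Sequential([
    layers.Input(shape=(timesteps, n_features)),
    layers.LSTM(64),                    # encode the hourly sequence
    layers.Dense(32, activation="relu"),
    layers.Dense(1),                    # predicted GHI (W/m^2) for the next hour
])
model.compile(optimizer="adam", loss="mse")

# Random placeholder data standing in for measured and satellite-derived inputs.
X = np.random.rand(1000, timesteps, n_features).astype("float32")
y = np.random.rand(1000, 1).astype("float32")
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
```

In practice the predicted and measured GHI series would then be compared with metrics such as rRMSD and rMBD, as reported in the study.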