The landscape of data-driven photovoltaic (PV) power prediction is constantly evolving and encompasses a myriad of techniques. Each technique relies on a self-adaptive algorithm that must be retrained at regular intervals, be it daily, weekly, or seasonally, to keep the model from generalizing poorly due to overfitting, underfitting, or concept drift. This paper aims to improve the generalization capability of PV power predictors, such as the autoencoders widely used in industry, by introducing feature-enhanced ensemble learning (FEEL) after the feature selection step. The framework combines nonparametric regression and generalized additive models with an ensemble of weak, regularized multilayer perceptrons. Once trained, the framework generalizes reliably on test data over long time periods without significant degradation in performance. The proposed framework was validated against a baseline autoencoder-based feature enhancement model on a real PV system in a smart neighborhood in Alabama for September 2019. The FEEL framework performed three times better than the baseline; when applied to the baseline, it improved the baseline's performance by a factor of two on average. Furthermore, the framework generalized consistently better than five other feature enhancement strategies. Despite fluctuations in weather, the R-squared score of the FEEL framework varied over a range of 8.1%, compared with 48.3% for the baseline. Mutual information and Minkowski distance scores were used to quantify concept drift and model drift, respectively; these scores show that the FEEL framework generalized the ensemble learning models at least twice as well as the baseline across the different test days. These results are a first step toward decentralized intelligence for smart grid applications, which could free up resources for other expensive analytics in the field.
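The abstract only summarizes the pipeline, so the sketch below is a rough, hypothetical illustration of a FEEL-style flow rather than the authors' implementation: an additive spline model (SplineTransformer plus Ridge) stands in for the nonparametric regression/GAM feature enhancement, a set of small, heavily regularized MLPRegressor instances plays the role of the weak ensemble, and mutual information and Minkowski distance are computed as simple drift proxies. Every function name, parameter value, and modeling stand-in here is an assumption, not a detail taken from the paper.

```python
# Illustrative sketch only -- not the authors' implementation; all names are hypothetical.
# Assumes scikit-learn >= 1.0, scipy, and numpy.
import numpy as np
from scipy.spatial.distance import minkowski
from sklearn.feature_selection import mutual_info_regression
from sklearn.linear_model import Ridge
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import SplineTransformer


def enhance_features(X, y):
    """GAM-like feature enhancement: fit an additive spline regression on the
    selected features and append its prediction as an extra feature."""
    gam_like = make_pipeline(SplineTransformer(n_knots=8, degree=3), Ridge(alpha=1.0))
    gam_like.fit(X, y)
    return np.hstack([X, gam_like.predict(X).reshape(-1, 1)]), gam_like


def fit_weak_ensemble(X, y, n_members=10, seed=0):
    """Ensemble of weak MLPs: one small hidden layer and a strong L2 penalty (alpha)."""
    rng = np.random.RandomState(seed)
    return [
        MLPRegressor(hidden_layer_sizes=(8,), alpha=1e-2, max_iter=500,
                     random_state=rng.randint(1_000_000)).fit(X, y)
        for _ in range(n_members)
    ]


def ensemble_predict(members, X):
    """Average the members' predictions."""
    return np.mean([m.predict(X) for m in members], axis=0)


def concept_drift_score(X_ref, y_ref, X_new, y_new):
    """Mean absolute change in per-feature mutual information with the target,
    a rough proxy for concept drift between a reference window and a new window."""
    return np.abs(mutual_info_regression(X_ref, y_ref)
                  - mutual_info_regression(X_new, y_new)).mean()


def model_drift_score(members_a, members_b, X):
    """Minkowski (p=2) distance between two ensembles' predictions on the same
    inputs, a rough proxy for model drift."""
    return minkowski(ensemble_predict(members_a, X), ensemble_predict(members_b, X), p=2)
```

Keeping each member deliberately weak (a single small hidden layer and a relatively large L2 penalty) and averaging the ensemble is one common way to trade a little training accuracy for the day-to-day stability the abstract emphasizes.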
INDEX TERMS Feature enhancement, nonparametric regression, ensemble learning, generalized additive model, concept drift, photovoltaic power prediction.

Nomenclature
Unless otherwise specified, the following nomenclature is applied throughout this paper.

Abbreviations
AE      Autoencoder; also denotes one of the feature enhancement strategies
ARIMA   Autoregressive integrated moving average
ARIMAX  ARIMA with exogenous inputs
ERM     Ensemble of regularized multilayer perceptrons
FEEL    Feature-enhanced ensemble learning
GAM     Generalized additive model