Forecasting and anomaly detection for energy time series is emerging as an important application area for computational intelligence and learning algorithms. The training of robust data-driven models relies on large measurement datasets sampled at ever-increasing rates, which demand substantial computational and storage resources for offline power quality analysis and online control in energy management schemes. We analyze the impact of the reporting rate of energy measurements on deep-learning-based forecasting models in a residential scenario. The work is also motivated by the development of embedded energy gateways for online inference and anomaly detection that avoid dependence on costly, high-latency cloud systems for data storage and algorithm evaluation. This in turn increases local computation and memory requirements, since predictions must be generated within the control sampling period. We report quantitative forecasting metrics (MSE, MAE, MAPE) to establish an empirical trade-off between reporting rate and model accuracy. Additional results consider rate-variable feature extraction using a time series data mining algorithm for multi-scale analytics.
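For reference, the three error metrics named above can be sketched as follows. This is a minimal illustration of the standard definitions, not the paper's implementation, and the example readings are hypothetical:

```python
# Standard definitions of the reported forecasting error metrics.
def mse(y_true, y_pred):
    # Mean squared error: average of squared residuals.
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def mae(y_true, y_pred):
    # Mean absolute error: average of absolute residuals.
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def mape(y_true, y_pred):
    # Mean absolute percentage error, in percent; assumes no zero targets.
    return 100.0 * sum(abs((t - p) / t) for t, p in zip(y_true, y_pred)) / len(y_true)

# Hypothetical hourly energy readings (kWh) vs. model predictions.
actual = [1.0, 2.0, 4.0, 2.0]
predicted = [1.1, 1.8, 4.4, 2.1]
print(mse(actual, predicted), mae(actual, predicted), mape(actual, predicted))
```

MAPE is scale-free, which makes it convenient for comparing models trained at different reporting rates, but it is undefined when a target value is zero, a common occurrence in residential load data.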