Federated learning is designed to collaboratively train a shared model across a large number of mobile devices while preserving data privacy, and has been widely adopted to support diverse geo-spatial systems. However, two critical issues prevent federated learning from being effectively deployed on resource-constrained devices at scale. First, federated learning incurs high energy consumption, which can severely shorten the battery life of mobile devices. Second, leakage of sensitive personal information can still occur during the training process. Thus, a system that effectively protects sensitive information while improving energy efficiency is urgently needed for mobile federated learning. This paper proposes SmartDL, an energy-aware decremental learning framework that balances energy efficiency and data privacy in an efficient manner. SmartDL improves energy efficiency at two levels: (1) a global layer, which adopts an optimization approach to select a subset of participating devices with sufficient capacity and maximum reward; and (2) a local layer, which adopts a novel decremental learning algorithm to actively provide decremental and incremental updates, while adaptively tuning the local DVFS (dynamic voltage and frequency scaling) setting. We prototyped SmartDL on a physical testbed and evaluated its performance using several learning benchmarks with real-world traces. The evaluation results show that, compared with original federated learning, SmartDL reduces energy consumption by 75.6-82.4% across different datasets. Moreover, SmartDL achieves a speedup of 2 to 4 orders of magnitude in model convergence while preserving model accuracy.
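To make the global-layer idea concrete, the sketch below shows one plausible form of capacity-aware, reward-maximizing device selection under a per-round energy budget. It is a minimal illustration only: the Device fields, the reward and energy-cost model, the greedy reward-per-joule strategy, and all names (select_devices, min_capacity, energy_budget) are our assumptions, not SmartDL's actual formulation.

```python
# Hypothetical sketch of global-layer participant selection: keep only
# devices with sufficient capacity, then greedily pick the highest
# reward-per-joule devices until a per-round energy budget is exhausted.
from dataclasses import dataclass

@dataclass
class Device:
    name: str
    capacity: float        # normalized available compute/battery capacity
    energy_cost: float     # estimated energy for one local training round (J)
    reward: float          # estimated contribution to the global model

def select_devices(devices, min_capacity=0.5, energy_budget=100.0):
    """Greedy selection by reward-per-joule under an energy budget (assumed model)."""
    eligible = [d for d in devices if d.capacity >= min_capacity]
    eligible.sort(key=lambda d: d.reward / d.energy_cost, reverse=True)
    chosen, spent = [], 0.0
    for d in eligible:
        if spent + d.energy_cost <= energy_budget:
            chosen.append(d)
            spent += d.energy_cost
    return chosen

# Example usage with a toy fleet
fleet = [Device("phone-a", 0.9, 30.0, 4.0),
         Device("phone-b", 0.4, 10.0, 3.0),   # filtered out: capacity too low
         Device("phone-c", 0.7, 25.0, 5.0),
         Device("phone-d", 0.8, 60.0, 5.5)]   # dropped: would exceed budget
print([d.name for d in select_devices(fleet)])  # ['phone-c', 'phone-a']
```

A greedy ratio heuristic like this is a standard approximation for such knapsack-style selection problems; the paper's optimization approach may differ in both objective and solver.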