Federated Learning (FL) enables the deployment of distributed machine learning models over the cloud and Edge Devices (EDs) while preserving the privacy of sensitive local data, such as electronic health records. However, despite the security and flexibility advantages of FL, current constructions still suffer from several limitations: heavy computation overhead on resource-limited EDs, communication overhead in uploading converged local model parameters to a centralized server for aggregation, and no guarantee that acquired knowledge is preserved under incremental learning over new local data sets. This paper introduces a secure and resource-friendly protocol for parameter aggregation in federated incremental learning and its applications. In this study, the central server relies on a new aggregation method called orthogonal gradient aggregation. The method assumes that each local data set changes continually and updates parameters in the direction orthogonal to the spaces spanned by previous parameters. As a result, our new construction is robust against catastrophic forgetting, maintains the accuracy of the federated neural network, and is efficient in both computation and communication.
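To make the idea concrete, below is a minimal NumPy sketch of how a server might project an aggregated gradient onto the orthogonal complement of previously observed update directions, in the spirit of the orthogonal gradient aggregation the abstract describes. The `OrthogonalAggregator` class, its Gram-Schmidt basis maintenance, and the FedAvg-style mean over client gradients are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np


class OrthogonalAggregator:
    """Hypothetical server-side aggregator: averages client gradients, then
    removes the components lying in the span of past update directions so
    each round's update is orthogonal to earlier parameter spaces."""

    def __init__(self, dim: int):
        self.dim = dim
        self.basis = []  # orthonormal directions spanning past update spaces

    def project(self, grad: np.ndarray) -> np.ndarray:
        """Subtract the projection of `grad` onto every stored direction."""
        g = grad.copy()
        for b in self.basis:
            g -= (g @ b) * b
        return g

    def remember(self, grad: np.ndarray) -> None:
        """Gram-Schmidt step: store the new direction if it has a component
        outside the current span (assumed memory policy, not the paper's)."""
        residual = self.project(grad)
        norm = np.linalg.norm(residual)
        if norm > 1e-8:
            self.basis.append(residual / norm)

    def aggregate(self, client_grads: list) -> np.ndarray:
        """FedAvg-style mean of client gradients, then orthogonal projection."""
        mean_grad = np.mean(client_grads, axis=0)
        update = self.project(mean_grad)
        self.remember(mean_grad)
        return update


# Usage: three hypothetical clients report gradients for a 5-parameter model.
agg = OrthogonalAggregator(dim=5)
rng = np.random.default_rng(0)
round1 = agg.aggregate([rng.normal(size=5) for _ in range(3)])
round2 = agg.aggregate([rng.normal(size=5) for _ in range(3)])
# The second update has no component along the first round's direction:
print(round2 @ round1 / np.linalg.norm(round1))  # approximately 0
```

Under this reading, catastrophic forgetting is mitigated because later updates cannot move the model along directions already used to fit earlier data; how the real protocol bounds the stored basis and secures the exchange is left to the paper itself.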