Cognitive workload recognition is pivotal to maintaining operators' health and preventing accidents in human-robot interaction. So far, workload research has mostly been restricted to a single task, and cross-task cognitive workload recognition remains a challenge. Furthermore, when extending to a new workload condition, the discrepancy of electroencephalogram (EEG) signals across different cognitive tasks limits the generalization of existing models. To tackle this problem, we propose EEG-based cross-task cognitive workload recognition models built with domain adaptation methods in a leave-one-task-out cross-validation setting, where each task of each subject is treated as a domain. Specifically, we first design a fine-grained workload paradigm comprising working memory and mathematical addition tasks. Then, we explore four domain adaptation methods to bridge the discrepancy between the two tasks. Finally, using a support vector machine classifier, we conduct experiments to classify low and high workload levels on a private EEG dataset. Experimental results demonstrate that the proposed task-transfer framework outperforms the non-transfer classifier, with improvements of 3% to 8% in mean accuracy, and that transfer joint matching (TJM) consistently achieves the best performance.
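To make the evaluation protocol concrete, the sketch below shows a leave-one-task-out loop with an SVM classifier on pre-extracted EEG features. It is a minimal illustration, not the authors' code: the private dataset and TJM are not reproduced here, and a simple CORAL-style covariance alignment is substituted as a stand-in domain adaptation step; the `features_by_task`/`labels_by_task` structures are hypothetical.

```python
# Minimal sketch (assumed feature format, CORAL substituted for TJM purely for illustration).
import numpy as np
from scipy import linalg
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler

def coral_align(source, target):
    """Re-color source features with the target covariance (simple CORAL)."""
    cs = np.cov(source, rowvar=False) + np.eye(source.shape[1])
    ct = np.cov(target, rowvar=False) + np.eye(target.shape[1])
    a = linalg.fractional_matrix_power(cs, -0.5) @ linalg.fractional_matrix_power(ct, 0.5)
    return np.real(source @ a)

def leave_one_task_out(features_by_task, labels_by_task):
    """Train on the remaining task(s), test on the held-out task, for every task."""
    accs = []
    tasks = list(features_by_task)
    for held_out in tasks:
        X_te, y_te = features_by_task[held_out], labels_by_task[held_out]
        X_tr = np.vstack([features_by_task[t] for t in tasks if t != held_out])
        y_tr = np.concatenate([labels_by_task[t] for t in tasks if t != held_out])
        scaler = StandardScaler().fit(X_tr)
        X_tr, X_te = scaler.transform(X_tr), scaler.transform(X_te)
        X_tr = coral_align(X_tr, X_te)          # bridge the cross-task discrepancy
        clf = SVC(kernel="rbf", C=1.0).fit(X_tr, y_tr)
        accs.append(clf.score(X_te, y_te))      # low vs. high workload accuracy
    return float(np.mean(accs))
```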
With growing concern over privacy in face recognition, federated learning has emerged as one of the most prevalent approaches to studying the unconstrained face recognition problem with private, decentralized data. However, conventional federated algorithms that share the full network parameters among clients suffer from privacy leakage in the face recognition setting. In this work, we introduce FedGC, a framework that tackles federated learning for face recognition while guaranteeing stronger privacy. We explore a novel idea of correcting gradients from the perspective of backward propagation and propose a softmax-based regularizer that corrects the gradients of class embeddings by precisely injecting a cross-client gradient term. Theoretically, we show that FedGC constitutes a valid loss function similar to the standard softmax. Extensive experiments validate the superiority of FedGC, which matches the performance of conventional centralized methods trained on the full dataset across several popular benchmark datasets.
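The sketch below illustrates the general idea of a softmax-based spread-out term on class embeddings: a client's local softmax loss is augmented with a penalty that pushes its class embeddings away from embeddings of classes held by other clients, so the corrected gradient resembles the one a global softmax would produce. It is an illustrative assumption-laden toy (PyTorch, cosine logits, fixed scale), not the authors' exact FedGC formulation.

```python
# Illustrative sketch only; `w_local`/`w_other` layouts and the scale value are assumptions.
import torch
import torch.nn.functional as F

def softmax_spread_regularizer(w_local, w_other, scale=64.0):
    """Penalize similarity between this client's class embeddings (w_local)
    and class embeddings held by other clients (w_other)."""
    w_local = F.normalize(w_local, dim=1)
    w_other = F.normalize(w_other, dim=1)
    logits = scale * w_local @ w_other.t()        # cross-client cosine similarities
    # log-sum-exp pushes each local class embedding away from all remote ones
    return torch.logsumexp(logits, dim=1).mean()

def client_loss(features, labels, w_local, w_other, lam=1.0, scale=64.0):
    """Local softmax cross-entropy plus the cross-client correction term."""
    logits = scale * F.normalize(features, dim=1) @ F.normalize(w_local, dim=1).t()
    ce = F.cross_entropy(logits, labels)          # standard local softmax loss
    return ce + lam * softmax_spread_regularizer(w_local, w_other, scale)
```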