High levels of cognitive workload decrease human performance and can lead to failures with catastrophic outcomes in risky missions. Reliable cognitive workload detection remains a major challenge, since workload is not directly observable. However, cognitive workload affects several physiological signals that can be measured noninvasively. The main goal of this work is to develop a reliable machine learning algorithm to identify the cognitive workload induced during rescue missions, evaluated through drone control simulation experiments. In addition, we aim to minimize computing resource usage while maximizing cognitive workload detection accuracy for reliable real-time operation. We perform an experiment in which 24 subjects played a rescue mission simulator while respiration, electrocardiogram, photoplethysmogram, and skin temperature signals were measured. State-of-the-art feature-based machine learning algorithms are investigated for cognitive workload characterization using learning curves, data augmentation, and cross-validation techniques. The best classification algorithm is then selected and optimized, and the most informative features are identified. Finally, the generalization power of the optimized model is evaluated on an unseen test set. We obtain an accuracy of 86% on the unseen test set using the proposed and optimized eXtreme Gradient Boosting (XGB) algorithm. We then reduce the complexity of the machine learning model for future implementation on resource-constrained wearable embedded systems by optimizing the model and selecting the 26 most important features. Overall, a generalizable and low-complexity machine learning model for cognitive workload detection based on physiological signals is presented for the first time in the literature.
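The cross-validated training pipeline described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: scikit-learn's `GradientBoostingClassifier` stands in for XGBoost, and the data are synthetic placeholders for the real ECG, respiration, PPG, and skin-temperature features (the feature count and windowing are assumptions).

```python
# Hedged sketch of a cross-validated workload classifier.
# GradientBoostingClassifier stands in for XGBoost; the data are synthetic
# placeholders for physiological features extracted from 24 subjects.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_subjects, n_features = 24, 40                      # hypothetical feature count
X = rng.normal(size=(n_subjects * 10, n_features))   # 10 windows per subject
y = rng.integers(0, 2, size=X.shape[0])              # binary workload label

clf = GradientBoostingClassifier(n_estimators=50, max_depth=3, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)            # 5-fold cross-validation
print(f"mean CV accuracy: {scores.mean():.2f}")
```

In practice the cross-validation folds would be split by subject rather than by window, so that the reported accuracy reflects generalization to unseen people.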
The use of drones in search and rescue (SAR) missions can be very cognitively demanding. Since high levels of cognitive workload can negatively affect human performance, there is a risk of compromising the mission and causing failure with catastrophic outcomes. Therefore, cognitive workload monitoring is key to preventing rescuers from making dangerous decisions. Due to the difficulty of gathering data during real SAR missions, we rely on virtual reality. In this work, we use a simulator to induce three levels of cognitive workload related to SAR missions with drones. To detect cognitive workload, we extract features from different physiological signals, such as electrocardiogram, respiration, skin temperature, and photoplethysmography. We propose a recursive feature elimination method that combines an eXtreme Gradient Boosting (XGBoost) algorithm with the SHapley Additive exPlanations (SHAP) score to select the most representative features. Moreover, we address both binary and three-class detection approaches. To this aim, we investigate different machine learning algorithms, such as XGBoost, random forest, decision tree, k-nearest neighbors, logistic regression, linear discriminant analysis, Gaussian naïve Bayes, and support vector machine. Our results show that, on an unseen test set extracted from 24 volunteers, an XGBoost classifier using 24 features reaches accuracies of 80.2% and 62.9% in detecting two and three levels of cognitive workload, respectively. Finally, our results open the door to fine-grained cognitive workload detection in the field of SAR missions.
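The recursive feature elimination loop described above can be sketched as follows. This is a simplified illustration under stated assumptions: impurity-based importances from a random forest stand in for the SHAP scores and XGBoost model used in the paper, and the data are synthetic, with only the first two features actually informative.

```python
# Hedged sketch of recursive feature elimination: at each step, drop the
# feature the model ranks least important. Random-forest impurity importance
# stands in here for the paper's XGBoost + SHAP scoring; data are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 10))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # only features 0 and 1 matter

kept = list(range(X.shape[1]))
while len(kept) > 2:                            # stop at a target feature count
    model = RandomForestClassifier(n_estimators=50, random_state=0)
    model.fit(X[:, kept], y)
    weakest = int(np.argmin(model.feature_importances_))
    kept.pop(weakest)                           # drop the least important feature

print("selected features:", kept)
```

On this toy data the loop converges toward the two informative features; with SHAP scores the same loop would rank features by their mean absolute Shapley contribution instead.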
Objective: Today, stress monitoring on wearable devices is challenged by the tension between high detection accuracy and battery lifetime, driven by multimodal data acquisition and processing. Limited research has addressed the classification cost on multimodal wearable sensors, particularly when the features are cost-dependent. Thus, we design a Cost-Aware Feature Selection (CAFS) methodology that trades off prediction power against energy cost for multimodal stress monitoring. Methods: CAFS selects the most important features under different energy constraints, which allows us to obtain energy-scalable stress monitoring models. We further propose a self-aware stress monitoring method that intelligently switches among the energy-scalable models, reducing energy consumption. Results: Using the CAFS methodology on experimental data and in simulation, we reduce the energy cost of the stress model designed without energy constraints by up to 94.37%. We obtain 90.98% accuracy and 95.74% confidence as the best values on unseen data, outperforming state-of-the-art studies. Analyzing our interpretable and energy-scalable models, we show that simple models using only heart rate (HR) or skin conductance level (SCL) confidently predict acute stress for HR > 93.30 BPM and non-stress for SCL < 6.42 µS; outside these ranges, a multimodal model using respiration and pulse-wave features is needed for confident classification. Our self-aware acute stress monitoring proposal saves 10x energy and provides 88.72% accuracy on unseen data. Conclusion: We propose a comprehensive solution for cost-aware acute stress monitoring design, addressing the problem of selecting an optimized feature subset under cost dependency and cost constraints. Significance: Our design framework enables long-term and confident acute stress monitoring on wearable devices.
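The self-aware switching logic implied by the thresholds reported above can be sketched as follows. The HR and SCL thresholds come from the abstract; everything else (function names, the fallback model) is a hypothetical stand-in, not the paper's implementation.

```python
# Hedged sketch of self-aware model switching: a cheap single-sensor rule
# answers confidently in the extreme ranges reported above, and only falls
# back to the costlier multimodal model in the ambiguous region.
def predict_stress(hr_bpm, scl_us, multimodal_model):
    if hr_bpm > 93.30:        # high heart rate -> confident "stress"
        return "stress"
    if scl_us < 6.42:         # low skin conductance -> confident "non-stress"
        return "non-stress"
    # Ambiguous region: wake the respiration / pulse-wave pipeline.
    return multimodal_model(hr_bpm, scl_us)

# Usage with a dummy multimodal fallback (hypothetical):
fallback = lambda hr, scl: "stress" if hr > 80 else "non-stress"
print(predict_stress(100.0, 8.0, fallback))  # -> stress (HR rule fires)
print(predict_stress(70.0, 5.0, fallback))   # -> non-stress (SCL rule fires)
print(predict_stress(85.0, 8.0, fallback))   # falls back to multimodal model
```

The energy saving comes from the first two branches: the multimodal sensors and their feature pipeline stay off whenever a single cheap signal already yields a confident decision.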