Unmanned Remotely Operated Vehicles (ROVs) are widely used in safety-critical missions spanning monitoring, surveillance, and search and rescue. In most cases, the human operator controls the ROV under stressful conditions and in harsh environments. The operator's physical and mental state is therefore critical, as fatigue or stress may cause involuntary movements and the issuing of unintentional commands. Consequently, a joint monitoring mechanism that assesses the operator's commands can help prevent potentially catastrophic events, such as accidents. To this end, in this study we propose a data collection framework for creating a fused dataset consisting of data obtained from both the human operator and the ROV, and we integrate stress induction to elicit the stress and fatigue that may affect operators during missions. We optimise the dataset, which includes the operator's physiological signals and inertial measurement unit (IMU) sensor data from the ROV, using data fusion approaches and feature extraction. We evaluate the dataset through statistical analysis, demonstrating a statistically significant difference between data characterising normal and involuntary commands. Finally, we evaluate the optimised and original datasets over a variety of classifiers, achieving accuracies of 94% and 95%, respectively.
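For illustration only, and not part of the study itself, the following is a minimal sketch of the kind of multi-classifier evaluation described above, using scikit-learn with a synthetic stand-in for the fused operator/ROV feature matrix; the feature dimensions, classifier choices, and cross-validation setup are assumptions rather than the paper's actual protocol.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for the fused dataset: each row would hold windowed
# physiological + IMU features, with labels for normal (0) vs. involuntary (1)
# commands. Dimensions here are illustrative assumptions only.
X, y = make_classification(n_samples=500, n_features=20, n_informative=10,
                           random_state=42)

# A small set of example classifiers (assumed, not the study's exact choices).
classifiers = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "svm_rbf": SVC(kernel="rbf"),
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=42),
}

for name, clf in classifiers.items():
    # Standardise features, then evaluate accuracy with 5-fold cross-validation.
    pipeline = make_pipeline(StandardScaler(), clf)
    scores = cross_val_score(pipeline, X, y, cv=5, scoring="accuracy")
    print(f"{name}: {scores.mean():.3f} +/- {scores.std():.3f}")
```

The same loop could be pointed at the optimised and the original feature matrices in turn to compare their classification accuracies, which is the comparison summarised by the 94% and 95% figures above.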