Manufacturing challenges are increasing the demand for more agile and dexterous means of production, while these systems aim to maintain or even increase productivity. The challenges arising from these developments can be tackled through Human-Robot Collaboration (HRC). HRC requires an effective task distribution according to each party's distinctive strengths, which is envisioned to generate synergetic effects. Seamless collaboration requires mutual awareness between human and robot, which is challenging because the two "speak" different languages, analogue versus digital. This challenge can be addressed by equipping the robot with a model of the human. Although a range of such models is available, data-driven models of the human are still at an early stage. This paper proposes an adaptive human sensor framework that incorporates objective, subjective, and physiological metrics, as well as associated Machine Learning, and is thus envisioned to adapt to the uniqueness and dynamic nature of human behavior. To test the framework, a validation experiment with 18 participants was performed, aiming to predict Perceived Workload during two scenarios, namely a manual and an HRC assembly task. Perceived Workload is reported to have a substantial impact on a human operator's task performance. Throughout the experiment, physiological data from an electroencephalogram (EEG), an electrocardiogram (ECG), and a respiration sensor were collected and interpreted. For the subjective metrics, the standardized NASA Task Load Index was used; the objective metrics included task completion time and the number of errors/assistance requests. Overall, the framework showed promising potential for adaptive behavior, which is ultimately envisioned to enable more effective HRC.
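To make the prediction step of the framework concrete, the following is a minimal sketch of how multimodal features (EEG, ECG, respiration, task time, error count) could be mapped to NASA-TLX workload scores. The feature columns, the example values, and the choice of a random-forest regressor are illustrative assumptions; the paper's actual feature extraction and Machine Learning pipeline are not specified here.

```python
# Minimal illustrative sketch; feature names, values, and the regressor choice
# are hypothetical and do not reflect the paper's actual implementation.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

# Hypothetical per-trial feature matrix: each row combines physiological and
# objective metrics recorded during one assembly trial.
# Columns: [EEG band-power index, mean heart rate (bpm),
#           respiration rate (breaths/min), task completion time (s),
#           number of errors / assistance requests]
X = np.array([
    [0.42, 72.0, 14.2, 310.0, 1],
    [0.55, 81.0, 16.8, 355.0, 2],
    [0.38, 69.0, 13.5, 290.0, 0],
    [0.61, 88.0, 18.1, 400.0, 3],
])

# Target: subjective Perceived Workload, e.g. an overall NASA-TLX score (0-100).
y = np.array([35.0, 55.0, 28.0, 70.0])

# A simple regressor mapping sensor-derived features to workload.
model = RandomForestRegressor(n_estimators=100, random_state=0)
scores = cross_val_score(model, X, y, cv=2, scoring="neg_mean_absolute_error")
print("CV mean absolute error:", -scores.mean())

# Fit on all trials and predict workload for an unseen trial.
model.fit(X, y)
new_trial = np.array([[0.50, 78.0, 15.0, 330.0, 1]])
print("Predicted Perceived Workload:", model.predict(new_trial)[0])
```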