2019 IEEE International Conference on Systems, Man and Cybernetics (SMC)
DOI: 10.1109/smc.2019.8914593

A Framework for Monitoring Human Physiological Response during Human Robot Collaborative Task

Abstract: In this paper, a framework for monitoring human physiological response during a Human-Robot Collaborative (HRC) task is presented. The framework highlights the importance of generating event markers related to both the human and the robot, and of synchronizing the collected data. The framework enables continuous data collection during an HRC task in which changes in robot movement serve as stimuli to evoke a human physiological response. It also presents two case studies based on this framework and a data visuali…
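The two ingredients the abstract emphasizes, event markers and synchronization of separately collected data, can be illustrated with a short sketch. This is not the authors' implementation; the logger class, stream names, and CSV layout below are assumptions used only to show the idea of time-stamping human- and robot-related events against one shared clock so that physiological samples can be aligned to them afterwards.

    import csv
    import time

    class EventMarkerLogger:
        """Write human- and robot-related event markers against one shared clock."""
        def __init__(self, path="event_markers.csv"):
            self.t0 = time.monotonic()                 # shared reference clock
            self.file = open(path, "w", newline="")
            self.writer = csv.writer(self.file)
            self.writer.writerow(["t_sec", "source", "event"])

        def mark(self, source, event):
            """Record an event such as a robot speed change or a human task step."""
            self.writer.writerow([round(time.monotonic() - self.t0, 4), source, event])
            self.file.flush()

    # Hypothetical usage during one HRC trial:
    # log = EventMarkerLogger()
    # log.mark("robot", "speed_changed_to_0.5")        # robot movement used as stimulus
    # log.mark("human", "assembly_step_started")

If every physiological stream is stamped against the same monotonic clock, each response can later be windowed around the marker that triggered it.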

Cited by 12 publications (8 citation statements). References 17 publications.
“…Similarly, a compliant joint or arm in the robot has been shown to reduce injury, which leads to better operator confidence (She et al., 2020). Furthermore, Savur et al. (2019) added that any potential physical collision with the robot decreases human trust and, consequently, the production benefits of HRC. On the other hand, Sauppé & Mutlu (2015) found that social features in industrial robots may reduce the safety of the interaction.…”
Section: Discussion (mentioning)
confidence: 99%
“…The adaptability of the robot has also been used to improve the acceptance and trust of the operator. As stated by Savur et al. (2019), trust is about managing human expectations; therefore, the authors developed a system that monitors human physiological responses to measure trust and suitably adapts the robot's speed, acceleration, and trajectory. As seen in Lasota & Shah (2015), improving the acceptance and trust of the operator leads to reduced idle times and improved productivity, similar to improving physical ergonomics.…”
Section: Human Factors and Cobot Capabilities (mentioning)
confidence: 99%
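The statement above summarizes the cited system as mapping a physiology-based trust measure to the robot's motion parameters. As a hedged illustration only, a minimal sketch of one such mapping follows; the trust input is assumed to be a value in [0, 1] produced by some upstream model, and SpeedAdapter, its bounds, and its smoothing factor are hypothetical, not the cited system's interface.

    class SpeedAdapter:
        """Smoothly map a normalized trust estimate to a robot speed scaling factor."""
        def __init__(self, min_scale=0.2, max_scale=1.0, alpha=0.1):
            self.min_scale = min_scale
            self.max_scale = max_scale
            self.alpha = alpha                  # smoothing factor to avoid abrupt jumps
            self.scale = max_scale

        def update(self, trust):
            trust = min(max(trust, 0.0), 1.0)                        # clamp to [0, 1]
            target = self.min_scale + (self.max_scale - self.min_scale) * trust
            self.scale += self.alpha * (target - self.scale)         # low-pass toward target
            return self.scale                   # e.g. passed on to the robot controller

    # adapter = SpeedAdapter()
    # new_scale = adapter.update(trust=0.4)     # lower trust -> slower robot motion

The low-pass step reflects a common design choice: abrupt speed changes could themselves startle the operator and distort the very response being measured.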
“…Collaborative robots, specifically in tasks where the user and the robot take independent actions that jointly manipulate the environment towards a mutual goal state, also perform environmental assistance. Examples of such systems include collaborative cleaning (Devin and Alami, 2016) and assembly (Savur et al., 2019; Zhao et al., 2020). A robot working collaboratively with a user can improve its efficiency by modeling the user's behavior, for example by determining specific poses in which to hold an object to facilitate fluid collaboration during assembly (Akkaladevi et al., 2016), by anticipating and delivering the next required item in assembly (Hawkins et al., 2013, 2014; Maeda et al., 2014) or cooking (Koppula et al., 2016; Milliez et al., 2016), or by providing help under different initiative paradigms during assembly (Baraglia et al., 2016).…”
Section: Environment (mentioning)
confidence: 99%
“…Seo et al. (2019) used machine learning to predict boredom from electroencephalogram (EEG) data. Savur et al. (2019) used EEG, Electrocardiogram (ECG), Electromyography (EMG), Galvanic Skin Response (GSR), Heart Rate (HR), Heart Rate Variability (HRV), and pupil dilation to adjust robot speed within HRC scenarios. Höfling et al. (2020) compared biophysiological measurements with the output of commercially available facial analysis software (FaceReader).…”
Section: Introduction (mentioning)
confidence: 99%
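Because the statement above lists several heterogeneous signals (GSR, HR, HRV, pupil dilation, etc.) being used together, a brief sketch may help show one common way such channels can be combined into a single index; the baseline z-scoring, channel names, and equal weighting below are assumptions for illustration, not a description of how any of the cited papers fuse their signals.

    import numpy as np

    def fused_index(samples, baseline, weights=None):
        """Combine several physiological channels into one normalized index.

        samples, baseline: dict mapping channel name -> 1-D array of values,
        with the baseline taken from a rest period before the task.
        """
        channels = sorted(samples)
        z_scores = []
        for ch in channels:
            mu = np.mean(baseline[ch])
            sigma = np.std(baseline[ch]) + 1e-9          # avoid division by zero
            z_scores.append((np.mean(samples[ch]) - mu) / sigma)
        w = np.ones(len(channels)) if weights is None else np.array([weights[ch] for ch in channels])
        return float(np.dot(w, z_scores) / np.sum(w))

    # idx = fused_index({"gsr": gsr_win, "hr": hr_win}, {"gsr": gsr_rest, "hr": hr_rest})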