Medical training simulators can provide a safe and controlled environment for medical students to practice their physical examination skills. Visual feedback from a patient's involuntary pain facial expressions in response to physical palpation of an affected area is an important source of information for physicians. However, most existing robotic medical training simulators that can capture physical examination behaviours in real time cannot display facial expressions, and they comprise only a limited range of patient identities in terms of ethnicity and gender. Together, these limitations restrict the utility of medical training simulators: they do not expose medical students to a representative range of pain facial expressions and face identities, which could result in biased practice, and they cannot be used to detect and correct early signs of bias in medical training. Here, for the first time, we present a robotic system that can simulate facial expressions of pain in response to palpations, displayed on a range of patient face identities. Our approach models dynamic pain facial expressions using the data-driven psychophysical method of reverse correlation, driven by the visuo-haptic interactions of users performing palpations on a robotic medical simulator. Specifically, participants performed palpation actions on the abdomen phantom of a simulated patient, which triggered the real-time display of six pain-related facial Action Units (AUs) on a robotic face (MorphFace), each controlled by two pseudo-randomly generated transient parameters: rate of change β and activation delay τ. Participants then rated the appropriateness of the facial expression displayed in response to their palpations on a 4-point scale from "strongly disagree" to "strongly agree". Each participant (n = 16; 4 Asian females, 4 Asian males, 4 White females and 4 White males) performed 200 palpation trials on 4 patient identities (Black female, Black male, White female and White male) simulated using MorphFace. Results showed that the facial expressions rated as appropriate by all participants featured a gradual decrease in β and increase in τ from the upper-face AUs (around the eyes) to the lower-face AUs (around the mouth). In contrast, the transient parameter values of the expressions rated most appropriate, the palpation forces, and the delays between palpation actions varied across participant-simulated patient pairs according to gender and ethnicity. These findings suggest that gender and ethnicity biases affect both palpation strategies and the perception of the pain facial expressions displayed on MorphFace. We anticipate that our approach will be used to generate physical examination models with diverse patient demographics, reducing erroneous judgments in medical students and enabling focused training to address these errors.
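To make the trial mechanics concrete, the minimal Python sketch below illustrates one reverse-correlation trial as described above: each of the six pain-related AUs is assigned a pseudo-randomly sampled rate of change β and activation delay τ, which together define its activation time course after a palpation. The sigmoidal activation curve, the specific AU set, the parameter ranges and the 2-second display window are illustrative assumptions, not details taken from the paper; the abstract specifies only that β and τ are pseudo-randomly generated per AU.

import numpy as np

# Assumed set of six pain-related AUs (a common core pain set; the paper's
# exact AU selection is not given in the abstract).
AUS = ["AU4", "AU6", "AU7", "AU9", "AU10", "AU43"]

rng = np.random.default_rng(42)

def sample_trial_params(n_aus=len(AUS)):
    """Pseudo-randomly sample per-AU transient parameters for one trial."""
    beta = rng.uniform(1.0, 10.0, n_aus)  # rate of change (assumed range)
    tau = rng.uniform(0.0, 1.0, n_aus)    # activation delay in s (assumed range)
    return beta, tau

def au_activation(t, beta, tau):
    """Sigmoidal activation of one AU at time t after palpation onset (assumed form)."""
    return 1.0 / (1.0 + np.exp(-beta * (t - tau)))

beta, tau = sample_trial_params()
t = np.linspace(0.0, 2.0, 100)  # 2-second display window (assumed)
activations = np.stack([au_activation(t, b, d) for b, d in zip(beta, tau)])
print(activations.shape)  # (6, 100): one activation curve per AU

Across many such trials, the participant's appropriateness ratings can then be regressed against the sampled (β, τ) values to recover which parameter combinations are perceived as appropriate, which is the essence of the reverse-correlation method named in the abstract.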