Critical Bias in Critical Care Devices
2023 · DOI: 10.1016/j.ccc.2023.02.005

Cited by 20 publications (13 citation statements) · References 90 publications
“…The algorithmic bias in OPE is a trade-off with variance but is inherent to this evaluation method (3). Furthermore, in state-spaces derived from electronic health record data, biases arising from medical devices and social determinants of care can result in inaccurate agent recommendations stemming from misrepresented states, comparable to supervised machine learning (59). These data could in the future be incorporated into the state-space if the clinician's notes are also incorporated into the dataset (60).…”
Section: Discussion
confidence: 99%
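The bias-variance trade-off in off-policy evaluation (OPE) noted in this citation can be made concrete with a small simulation. The sketch below is illustrative only and not from the cited papers: the two-action bandit, the behavior and target policies, the reward values, and the sample sizes are all invented. It compares ordinary importance sampling, which is unbiased but high-variance, with weighted (self-normalized) importance sampling, which accepts some bias to reduce variance.

```python
# Illustrative OPE sketch (hypothetical setup, not the cited paper's method):
# ordinary importance sampling (unbiased, higher variance) vs. weighted
# importance sampling (biased, lower variance) in a two-action bandit.
import numpy as np

rng = np.random.default_rng(0)

b = np.array([0.5, 0.5])            # behavior policy that generated the logs
pi = np.array([0.9, 0.1])           # target policy we want to evaluate
true_reward = np.array([1.0, 0.0])  # expected reward per action (ground truth)

def ope_estimates(n):
    """Return (ordinary IS, weighted IS) estimates from n logged samples."""
    actions = rng.choice(2, size=n, p=b)
    rewards = true_reward[actions] + rng.normal(0.0, 0.1, size=n)
    w = pi[actions] / b[actions]           # importance weights
    ois = np.mean(w * rewards)             # unbiased, variance grows with w
    wis = np.sum(w * rewards) / np.sum(w)  # normalized: biased, lower variance
    return ois, wis

true_value = pi @ true_reward  # 0.9 under the target policy
estimates = np.array([ope_estimates(50) for _ in range(2000)])
for name, col in [("ordinary IS", 0), ("weighted IS", 1)]:
    bias = estimates[:, col].mean() - true_value
    print(f"{name}: bias={bias:+.4f}, std={estimates[:, col].std():.4f}")
```

Across the repeated trials, ordinary importance sampling centers on the true value but spreads more widely, while the weighted estimator shows a small systematic offset with a tighter spread, which is the trade-off the quoted passage refers to.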
“…Specifically, pulse oximeters exhibit reduced accuracy in patients with darker skin pigmentation, an issue attributed to device miscalibration and the lack of diversity in development phases [106,116,127,133]. Similar disparities in device accuracy affecting measurements such as oxygen saturation, body temperature, and blood pressure [44] have been documented, often resulting from insufficiently diverse calibration populations [26]. A systematic review of mechanical ventilation studies found that AI applied to mechanical ventilation has limited external validation and model calibration, with a substantial risk of bias, significant gaps in reporting, and poor code and data availability [53]. Such discrepancies in device performance can introduce biases into clinical data, potentially influencing treatment decisions, such as the administration of supplemental oxygen or the preference for certain temperature measurement methods, thereby affecting diagnoses and treatments for specific racial subgroups.…”
Section: Data Bias in Medical Devices and Algorithms
confidence: 97%
“…Pulse oximetry is a prominent example of how racial and ethnic bias can manifest in critical care medical equipment [46]. Underperformance of the pulse oximeter in patients with darker skin color has been shown to result in events of hidden hypoxemia, which can be defined as SaO2 (measured by arterial blood gas [ABG]) < 88% but SpO2 (measured by pulse oximetry) ≥ 92% [45, 46].…”
Section: Example Case Study
confidence: 99%
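The hidden-hypoxemia definition quoted above is a pair of simple thresholds, so it translates directly into code. The minimal sketch below uses the thresholds from the citation (SaO2 < 88% with SpO2 ≥ 92%); the function name and the example readings are hypothetical, chosen only to show which paired measurements the definition flags.

```python
# Minimal sketch of the hidden-hypoxemia definition quoted above:
# SaO2 (arterial blood gas) < 88% while SpO2 (pulse oximetry) >= 92%.
# Thresholds come from the citation; names and data are illustrative.
def is_hidden_hypoxemia(sao2: float, spo2: float) -> bool:
    """True when the ABG shows hypoxemia that the pulse oximeter misses."""
    return sao2 < 88.0 and spo2 >= 92.0

# Paired (SaO2, SpO2) readings in percent; purely made-up example values.
readings = [(86.0, 93.0), (95.0, 96.0), (87.5, 90.0)]
for sao2, spo2 in readings:
    label = "hidden hypoxemia" if is_hidden_hypoxemia(sao2, spo2) else "ok"
    print(f"SaO2={sao2}%, SpO2={spo2}% -> {label}")
```

Note that the third reading is hypoxemic by ABG but is not "hidden" under this definition, because the pulse oximeter also reads below 92%; only the first reading, where the oximeter looks reassuring while the ABG shows hypoxemia, is flagged.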