2021
DOI: 10.1177/01622439211053661

Ethics as Discursive Work: The Role of Ethical Framing in the Promissory Future of Data-driven Healthcare Technologies

Abstract: The allure of a “data-driven” future healthcare system continues to seduce many. Increasingly, work in Science & Technology Studies and related fields has started to interrogate the saliency of this promissory rhetoric by raising ethical questions concerning epistemology, bias, surveillance, security, and opacity. Less visible is how ethical arguments are used as part of discursive work by various practitioners engaged in data-driven initiatives in healthcare. This article argues for more explicit attention to…

Cited by 13 publications (10 citation statements)
References: 64 publications
“…In this line, it is anticipated that the use of AI-DSSs can relieve caregivers of data-intensive analytical tasks, proactively directing their attention to issues and trends in the data that may warrant it, and possibly even guiding them towards certain care strategies (see prerequisite 2). These findings align with previous studies that position the use of AI as a 'technical fix' to mitigate existing risks related to the remote monitoring of older adults, such as caregivers' potential cognitive overload [29,43]. However, the anticipated utility of proactive AI-DSSs must be carefully balanced against the predominant viewpoint that automation of decision-making in the nursing process should be avoided (prerequisite 2), that AI-DSSs might only be introduced in practice through incremental steps aligned with users' evolving trust in, and experience with, these systems (prerequisite 3), and that vigilance is required to prevent caregivers from becoming overly reliant on AI-DSSs and being led astray towards unsuitable care strategies (see also [44,45]).…”
Section: Implications for Research and Practice (supporting)
confidence: 89%
“…Hence, we call for technology developers, caregivers using AI-DSSs, and other stakeholders, including older adults, to engage in ongoing public discourses (see also [59]) and work together to cohesively address the different factors important to the responsible embedding of AI-DSSs in practice. In doing so, we recommend viewing the responsible use of AI-DSSs as a balancing act (e.g., [43]). Potential or proven positive and negative impacts could be carefully weighed against each other, or, stated differently, trade-offs could be made between the effects of using AI-DSSs on values such as quality of life, autonomy, privacy, transparency and fairness (see also [60]).…”
Section: Responsible Innovation: A Balancing Act (mentioning)
confidence: 99%
“…The approach we took in this paper moves away from many critical (policy) analyses but also from, for example, normative academic ethics focusing on abstract principles (e.g. ‘fair’, ‘transparent’ and ‘trustworthy’ data analytics) that attempt to solve all sorts of ethical (and, in parallel, legal) issues surrounding data-driven technologies upfront (Stevens, 2021; Wehrens et al, 2021). Instead, we argue for the importance of empirical research to study how data-driven technologies become part of healthcare practices.…”
Section: Discussion (mentioning)
confidence: 99%
“…For example, one set of principles for ethical AI includes beneficence, nonmaleficence, autonomy, justice, and explicability (Floridi et al, 2018). In the context of a data-driven healthcare system, data ethics has been positioned by Wehrens et al (2021) as a form of discursive work performed by health practitioners as they consider what they ought to do and what is good or worthwhile, while negotiating tensions in data use. This work includes balancing of the different "goods" involved in data use (e.g., scientific, economic, public, and professional), applying ethical "fixes" for data use through institutional policies and methods (such as ethics review boards and anonymization), and collective deliberation.…”
Section: Data Ethics and Governance Cultures (mentioning)
confidence: 99%
“…Further, the term data ethics has no agreed definition (Hasselbalch, 2019). It has been variously conceptualized as guidelines for the ethical use of data such as professional codes of ethics (Stark & Hoffmann, 2019), structured ways of understanding what is ethical as the basis for gathering and using data, a form of work in data‐driven cultures (e.g., Wehrens et al, 2021), and a social movement geared at redressing power imbalance created by the use of big data (e.g., Hasselbalch & Tranberg, 2016). In the context of data science, Stark and Hoffmann (2019) describe data ethics as involving a series of conversations that “represent an effort to better grapple with the consequences of the language we use for understanding and working with data—‘big’ or otherwise—today, and how our discourses around data cultures shape their material, cultural, and political impact” (Stark & Hoffmann, 2019, p. 3).…”
Section: Data Ethics and Governance Cultures (mentioning)
confidence: 99%