2018
DOI: 10.1080/10253866.2018.1519489
Self-quantification and the datapreneurial consumer identity

Cited by 31 publications (14 citation statements) | References 53 publications
“…Starting once again with the individual LoA: as more diagnostic and therapeutic interventions become based on AI-Health solutions, individuals may be encouraged to share more and more personal data about themselves (Racine et al, 2019) - data that can then be used in opaque ways (Sterckx et al, 2016). This means that the ability for individuals to be meaningfully involved in shared decision making is considerably undermined. As a result, the increasing use of algorithmic decision-making in clinical settings can have negative implications for individual autonomy, as for an individual to be able to exert agency over the AI-Health derived clinical decision, they would need to have a good understanding of the underlying data, processes and technical possibilities that were involved in it being reached (DuFault & Schouten, 2018) and be able to ensure their own values are taken into consideration (McDougall, 2019). The vast majority of the population do not have the level of eHealth literacy necessary for this (Kim & Xie, 2017), and those that do (including HCPs) are prevented from gaining this understanding due to the black-box nature of AI-Health algorithms (Watson et al, 2019).…”
Section: Normative Concerns: Unfair Outcomes and Transformative Effects (mentioning)
confidence: 99%
“…The mood of incompletion that the behavioural surplus project engenders is perhaps best evidenced by previous research that focuses on individuals or groups who self-elect to generate and curate data from their day-to-day activities – a phenomenon that has been referred to as ‘dataist’ lifestyles (DuFault and Schouten, 2020), ‘self-tracking’ (Charitsis et al, 2019), ‘everyday analytics’ (Pantzar and Ruckenstein, 2015), ‘lived informatics’ (Rooksby et al, 2014) or ‘lifelogging’ (Räikkönen and Grénman, 2020). For example, in prior ethnographic engagements with members of the Quantified Self (QS) community – an international collective that shares insights from personal data – we see how self-tracking technologies are welcomed into consumers’ lives to enhance self-knowledge and optimise the self, despite self-trackers’ recognition of surveillance capitalism’s privacy threats (Bode and Kristensen, 2015; Kristensen and Ruckenstein, 2018).…”
Section: Introduction (mentioning)
confidence: 99%
“…In the case of credit scoring, for example, most work to date has relied on statistical analyses of datasets (Fellowes, 2006; Nelson, 2010) or quantitative surveys (Arya et al, 2013; Levinger et al, 2011). More recently, researchers have turned to more interpretive methods, analyzing discourse in discussion forums (Deville, 2016; DuFault and Schouten, 2020; Mackenzie, 2017) or conducting ethnographic studies of marginalized groups and financial inclusion advocates (Kear, 2017). Beyond that, more mundane forms of sense-making, folk theories, and everyday experiences of credit scoring have remained under-explored, even within scholarship on the Quantified Self movement (DuFault and Schouten, 2020).…”
Section: From Data Systems To Data Subjects (mentioning)
confidence: 99%
“…Even Facebook users who interact with personalized ads and news feeds on a daily basis are not necessarily aware that they are being algorithmically targeted (Eslami et al, 2015: 153, 2016; Rader and Gray, 2015: 178). In an analysis of online credit forums, DuFault and Schouten (2020: 302) found that it usually takes a “precipitating incident” or negative event to trigger such awareness. The problem, then, is that any study focusing on the experiences of data subjects will inevitably prompt participants with information that is likely to transform their understanding of their own position vis-à-vis the system.…”
Section: Pitfalls For Empirical Inquiry (mentioning)
confidence: 99%