The Social Power of Algorithms 2019
DOI: 10.4324/9781351200677-8
‘Hypernudge’: Big Data as a mode of regulation by design

Abstract: This paper draws on regulatory governance scholarship to argue that the analytic phenomenon currently known as 'Big Data' can be understood as a mode of 'design-based' regulation. Although Big Data decision-making technologies can take the form of automated decision-making systems, this paper focuses on algorithmic decision-guidance techniques. By highlighting correlations between data items that would not otherwise be observable, these techniques are being used to shape the informational choice context…



Cited by 49 publications (68 citation statements); references 18 publications.
“…Meanwhile, digital firms use a variety of psychological, emotional, and economic incentives, disincentives, and heuristics to repeatedly nudge users and customers to "share" ever more information (Acquisti et al., 2015; Fourcade, 2017; Yeung, 2017). Thaler and Sunstein (2008) use the term "choice architecture" to refer to these techniques.…”
Section: Generalized Reciprocity: Data Sharing Agreements (mentioning)
confidence: 99%
“…As noted by MacKenzie (2018), the central issue in recent debates about Big Data, privacy, commercial and political targeting, filter bubbles and so on, is how individuals can or should be taken into account in the calculation, and with what consequences for society. Personalization simultaneously represents a promise of emancipation from the broad statistical categories of private and public bureaucracies (Cardon, 2015; Thévenot, 2019), and a potential threat to our privacy and freedom of choice, as it often implies surveillance, targeting and nudging (Lyon, 2014; Yeung, 2017).…”
Section: Personalization As a Disputed Moral Ground (mentioning)
confidence: 99%
“…Which features of the data set are selected as important may be either appropriate or inappropriate for the decision at hand. Attorney General Eric Holder perhaps best summarizes this concern in regard to risk assessment algorithms in criminal justice: I am concerned that they [algorithms used in sentencing] inadvertently undermine our efforts to ensure individualized and equal justice … Criminal sentences … Similarly, algorithms act like design-based regulation (Yeung 2017), where algorithms can be used for the consistent application of legal and regulatory regimes (Thornton 2016, p. 1826); algorithms can enforce morality (Diakopoulos 2013), while still being designed and used by individuals. For algorithms, in addition to directly coding the algorithm to prioritize one group more than any others, two mechanisms can also indirectly drive bias in the process: proxies and machine learning.…”
Section: Reinforcing or Undercutting Ethical Principles (mentioning)
confidence: 99%