2018
DOI: 10.1093/idpl/ipy005

Why the ‘Computer says no’: illustrating big data’s discrimination risk through complex systems science

Cited by 4 publications (1 citation statement). References 0 publications.
“…Vulnerable citizens are increasingly transformed into data subjects but have little ability to interrogate how their data is being collected, stored, analyzed, and used (Redden, 2018). We call this datafied marginalization; where the risks of datafication are borne by data subjects and the benefits enjoyed by controllers (Cinnamon, 2017; Mann & Daly, 2019; Rhoen & Feng, 2018). Data collected, stored, and analyzed by companies and used by the government brings new complexities in terms of ownership, transparency, and accountability that impact social equity.…”
Section: Discussion
Confidence: 99%