2023
DOI: 10.1007/978-3-031-22982-4_3

Data-Driven. Privacy, Surveillance, Manipulation

Cited by 1 publication (1 citation statement)
References 27 publications
“…In parallel, a key requirement is to control (and clean up) datasets, fight disinformation and foster truly diverse, inclusive and reliable content (Georgieva et al. 2022). These transformations support benefits and trigger uncertainties, especially as the above-mentioned driver functions reveal challenges of socio-technical issues with inequalities (Holton and Boyd 2021), system biases (Rawat and Vadivu 2022), cultural-economic colonization (Bell 2018), data colonialism (Couldry and Mejias 2019), data-driven surveillance with privacy issues (Fossa 2023) and political destabilization through fake campaigns (Borsci et al. 2023). The interactivity and virality of personal assistants, virtual influencers, AI-produced content and art, or deepfakes lead to fundamental questions, such as how information sources can be evaluated or how they will add value to an existing ICM process or service.…”
Section: Theoretical Framework (mentioning)
Confidence: 99%