2020
DOI: 10.1177/1461444820925811

Anticipating and addressing the ethical implications of deepfakes in the context of elections

Abstract: New media synthesis technologies are rapidly advancing and becoming more accessible, allowing users to make video and audio clips (i.e. deepfakes) of individuals doing and saying things they never did or said. Deepfakes have significant implications for the integrity of many social domains including that of elections. Focusing on the 2020 US presidential election and using an anticipatory approach, this article examines the ethical issues raised by deepfakes and discusses strategies for addressing these issues…

Cited by 84 publications (29 citation statements)
References 33 publications
“…The present study was guided by media and information studies theories and methods. The discourse around video and image manipulation has fomented panic about the use of technology in national and state politics, for example, elections (Diakopoulos & Johnson, 2021). The present study suggests that the more common risks have disproportionately affected women, people of color, LGBTQIA individuals, and those fighting established power (Maddocks, 2020; Paris & Donovan, 2019).…”
Section: Literature Review
mentioning
confidence: 64%
“…For some time, computer graphics systems fed by data systems have been able to parse and generate images, even realistic images, from an existing video. Prior to 2017, this was the province of the major motion picture studios, which used the technology to entertain mass audiences, and computer science research laboratories, which used it to develop computer vision (Bansal et al., 2018; Diakopoulos & Johnson, 2021; Paris & Donovan, 2019; Suwajanakorn et al., 2017; Vaccari & Chadwick, 2020). In 2017, consumer-grade, and sometimes free, image manipulation software using machine learning gained public attention as pornographic videos with the faces of famous women grafted onto pornographic actors' bodies appeared on Reddit (Cole, 2017; Maddocks, 2020).…”
Section: Introduction
mentioning
confidence: 99%
“…Some worry that the technology opens the door to new kinds of cyber-bullying, sexual harassment and humiliation (Harris, 2019; Öhman, 2019). Others have raised concerns over the ability of synthetic media to ruin people's reputation by planting false beliefs among the public (Diakopoulos & Johnson, 2021, p. 2081) or even within law enforcement (Yadav & Salmani, 2019, p. 4). Relatedly, some stress the political aspects, warning that synthetic audio and video material falsely starring political figures may pollute democratic discourse to the extent that meaningful deliberation becomes impossible, due to the lack of a shared ground truth (Diakopoulos & Johnson, 2021).…”
mentioning
confidence: 99%
“…Others have raised concerns over the ability of synthetic media to ruin people's reputation by planting false beliefs among the public (Diakopoulos & Johnson, 2021, p. 2081) or even within law enforcement (Yadav & Salmani, 2019, p. 4). Relatedly, some stress the political aspects, warning that synthetic audio and video material falsely starring political figures may pollute democratic discourse to the extent that meaningful deliberation becomes impossible, due to the lack of a shared ground truth (Diakopoulos & Johnson, 2021). Some even go as far as to predict an "information apocalypse" (Warzel, 2018) or "epistemic maelstrom" (Rini, 2020, p. 8), where the (mediated) truth becomes impossible to discern through the constant noise of synthetic content.…”
mentioning
confidence: 99%
“…At the same time, it could be argued that the act of circulating a photographic or video replication of an individual's face on someone else's body, even playfully, if done without the individual's permission, is never 100% ethically benign. Diakopoulos and Johnson (2020) call such replication "persona plagiarism," but I prefer the term "persona appropriation" as it allows for different kinds of creative intent and viewer readings beyond that of deception. Nonetheless, as a few commenters on the CDFK video point out, even if Keanu does not lose anything from the video, Corridor Digital (the company behind the channel) is leveraging the promotional power of the star's name and face for their own economic benefit, and their gain is skating a little too close to the unethical line.…”
mentioning
confidence: 99%