2016 | DOI: 10.1177/1527476416680453

Training Computers to See Internet Pornography: Gender and Sexual Discrimination in Computer Vision Science

Abstract: This article critically examines computer vision–based pornography filtering (CVPF), a subfield in computer science seeking to train computers on how to recognize the difference between digital pornographic images and nonpornographic images. Based on a review of 102 peer-reviewed CVPF articles, we argue that CVPF has as a whole trained computers to “see” a very specific, idealized form of pornography: pictures of lone, thin, naked women. The article supports this argument by closely reading the algorithms prop…
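For readers unfamiliar with the genre of algorithm the article dissects, the sketch below illustrates the skin-color heuristic that recurs across early CVPF work: threshold each pixel's chrominance against a fixed "skin" box, then flag the image if the skin-pixel fraction exceeds a cutoff. The chrominance ranges, the cutoff value, and the function names here are illustrative assumptions, not figures drawn from the article.

```python
import numpy as np

# Illustrative Cb/Cr "skin" box and decision cutoff; individual CVPF papers
# tune these values differently. They are assumptions for this sketch only.
CB_RANGE = (77, 127)
CR_RANGE = (133, 173)
SKIN_FRACTION_CUTOFF = 0.30  # hypothetical threshold

def skin_fraction(rgb: np.ndarray) -> float:
    """Fraction of pixels whose chrominance falls inside the 'skin' box.

    `rgb` is an H x W x 3 uint8 array.
    """
    r = rgb[..., 0].astype(np.float64)
    g = rgb[..., 1].astype(np.float64)
    b = rgb[..., 2].astype(np.float64)
    # RGB -> Cb/Cr (ITU-R BT.601 full-range conversion)
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    mask = (
        (cb >= CB_RANGE[0]) & (cb <= CB_RANGE[1])
        & (cr >= CR_RANGE[0]) & (cr <= CR_RANGE[1])
    )
    return float(mask.mean())

def is_flagged(rgb: np.ndarray) -> bool:
    """Naive filtering decision: flag when enough pixels look like 'skin'."""
    return skin_fraction(rgb) >= SKIN_FRACTION_CUTOFF
```

Fixing a single chrominance box and a single ratio in advance is the kind of design choice the article critiques: the detector's notion of "skin," and therefore of pornography, is decided by its designers before it ever sees an image.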

Cited by 24 publications (12 citation statements) | References 66 publications (45 reference statements) | Citing publications: 2018–2024

Citation statements (ordered by relevance):
“…For example, a survey of literature on computer vision systems for detecting pornography found that the task is largely framed around detecting the features of thin, nude, female-presenting bodies with little body hair, largely to the exclusion of other kinds of bodies, thereby implicitly assuming a relatively narrow and conservative view of pornography that happens to align with a straight male gaze. 25 In an examination of the person categories within the ImageNet dataset, 3 Crawford and Paglen 26 uncovered millions of images of people that had been labeled with offensive categories, including racial slurs and derogatory phrases. In a similar vein, Birhane and Prabhu 27 examined a broader swath of image classification datasets that were constructed using the same categorical schema as ImageNet, finding a range of harmful and problematic representations, including non-consensual and pornographic imagery of women.…”
Section: Representational Harms in Datasets (mentioning)
Confidence: 99%
“…But this data is an artifact of the policies and judgments of the platform's existing moderation arrangements. An effective tool may learn to make the same kinds of distinctions as before (Binns et al., 2017; Gehl et al., 2017). But while consistency might sound like a good thing, these policies should actually adapt over time (Sinnreich, 2018).…”
Section: The Pitfalls of Automating Moderation (mentioning)
Confidence: 99%
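To spell out the point about consistency, here is a minimal sketch, with hypothetical toy data and a generic scikit-learn text classifier standing in for whatever model a platform might actually deploy, of how a tool trained on historical moderation decisions simply re-applies the policy embedded in those labels.

```python
# Sketch only: toy posts and labels invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Past moderator decisions: the labels encode yesterday's policy and judgment calls.
historical_posts = [
    "explicit photo set, link in bio",
    "check out this adult content",
    "photos from our beach vacation",
    "my figure drawing class sketches",
]
historical_decisions = [1, 1, 0, 0]  # 1 = removed under the old policy, 0 = kept

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(historical_posts, historical_decisions)

# The tool now applies the old policy's learned distinctions to new posts,
# regardless of whether the platform's norms have since shifted.
new_posts = ["life drawing workshop this weekend"]
predicted_decisions = model.predict(new_posts)
```

If the platform later revises its policy, the model keeps enforcing the old one until it is retrained on new decisions, which is the adaptation problem the quoted passage raises.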
“…Breasts are thus more easily detected as “offensive sexual content” than penises. This brings us back to the broader problem of white male heterosexual bias in the design of computer software [Gehl et al., 2017].…”
Section: How do platforms go about preventing content categorized as “NSFW” from appearing on our screens? (unclassified)