2020
DOI: 10.1609/icwsm.v14i1.7320
Disturbed YouTube for Kids: Characterizing and Detecting Inappropriate Videos Targeting Young Children

Abstract: A large number of the most-subscribed YouTube channels target children of very young age. Hundreds of toddler-oriented channels on YouTube feature inoffensive, well produced, and educational videos. Unfortunately, inappropriate content that targets this demographic is also common. YouTube's algorithmic recommendation system regrettably suggests inappropriate content because some of it mimics or is derived from otherwise appropriate content. Considering the risk for early childhood development, and an increasin…

Cited by 69 publications (37 citation statements)
References 18 publications
“…Social media algorithms that prioritize attractive and revenue‐generating content over viewer safety may further contribute to exposure to harmful behaviors, as children and parents have less control over the content they engage with (Radesky et al., 2020). For instance, one study found that videos appropriate for children 1–5 years old lead to an inappropriate video recommendation 3.5% of the time within 10 recommendations (Papadamou et al., 2020). Social media platforms allow for anonymized interactions with fewer boundaries and regulations (Lowry, Zhang, Wang, & Siponen, 2016) and have demonstrated more propagation of hateful posts as compared to nonhateful ones (Mathew, Dutt, Goyal, & Mukherjee, 2019).…”
Section: Discussion
confidence: 99%
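The "3.5% within 10 recommendations" figure above comes from following recommendation chains starting at child-appropriate videos. A minimal sketch of that style of measurement, on a toy recommendation graph with entirely invented video labels and edges (not the authors' data or code), might look like:

```python
import random

random.seed(42)

# Toy recommendation graph: each video maps to its recommended videos.
# All names and edges here are hypothetical, for illustration only.
GRAPH = {
    "nursery_rhyme_1": ["nursery_rhyme_2", "cartoon_1", "disturbing_1"],
    "nursery_rhyme_2": ["cartoon_1", "nursery_rhyme_1"],
    "cartoon_1": ["nursery_rhyme_2", "cartoon_2"],
    "cartoon_2": ["cartoon_1", "disturbing_1"],
    "disturbing_1": ["disturbing_2", "cartoon_2"],
    "disturbing_2": ["disturbing_1"],
}
INAPPROPRIATE = {"disturbing_1", "disturbing_2"}

def walk_hits_inappropriate(start, max_hops=10):
    """Follow uniformly random recommendations from `start`; return True if
    any of the first `max_hops` recommended videos is labeled inappropriate."""
    current = start
    for _ in range(max_hops):
        current = random.choice(GRAPH[current])
        if current in INAPPROPRIATE:
            return True
    return False

# Estimate the hit rate over many independent walks from one seed video.
trials = 10_000
hits = sum(walk_hits_inappropriate("nursery_rhyme_1") for _ in range(trials))
print(f"Estimated hit rate within 10 hops: {hits / trials:.1%}")
```

On a real platform the graph would be built from crawled recommendation lists rather than declared by hand, and the resulting rate depends entirely on that graph; the toy numbers here carry no relation to the 3.5% finding.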
“…As our work measures a specific social phenomenon (the QAnon movement), we also rely on research that has measured other phenomena, for example, attacks on the Syrian White Helmets (Wilson and Starbird 2020), the Manosphere (Ribeiro et al 2021), disturbing videos targeting children (Papadamou et al 2020), and rumors in crisis events (Starbird et al 2018b). Most similar to our investigation is the work of Papasavva et al., who investigate the posting behavior, emergent themes, and toxicity of the QAnon movement by characterizing the /v/GreatAwakening subcommunity on voat.co (Papasavva et al 2020).…”
Section: Related Work
confidence: 99%
“…Figure 1 depicts the recommended-video interface. Examining the top 10 recommendations follows the convention used in the literature for measuring recommendation effects [43,44]. Note that this list was not the same list as our API search query for fenbendazole.…”
Section: YouTube Data Collection
confidence: 99%