2020
DOI: 10.1145/3392854

Measuring Misinformation in Video Search Platforms: An Audit Study on YouTube

Abstract: Search engines are the primary gateways of information. Yet, they do not take into account the credibility of search results. There is a growing concern that YouTube, the second largest search engine and the most popular video-sharing platform, has been promoting and recommending misinformative content for certain search topics. In this study, we audit YouTube to verify those claims. Our audit experiments investigate whether personalization (based on age, gender, geolocation, or watch history) contributes to a…
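The audit described in the abstract rests on a data-collection step: issuing a fixed set of queries and recording the ranked results YouTube returns, which can then be annotated for misinformative stance. The sketch below is a minimal, hypothetical illustration of that collection step using the public YouTube Data API v3; the API key, query list, result count, and output file are assumptions made here for illustration, and the personalization conditions the paper studies (age, gender, geolocation, watch history) are not reproduced.

```python
# Hypothetical sketch of a search-audit data-collection step: fetch the
# top-ranked results YouTube returns for a fixed set of audit queries via
# the public YouTube Data API v3. The API key, query list, and CSV output
# are placeholders, not taken from the paper.

import csv

from googleapiclient.discovery import build  # pip install google-api-python-client

API_KEY = "YOUR_API_KEY"   # placeholder credential
AUDIT_QUERIES = [          # illustrative audit topics, not the paper's list
    "flat earth",
    "chemtrails",
    "vaccines cause autism",
]


def top_search_results(youtube, query, n=20):
    """Return the top-n video results (rank, id, title, channel) for a query."""
    response = youtube.search().list(
        q=query,
        part="snippet",
        type="video",
        maxResults=n,
    ).execute()
    return [
        {
            "query": query,
            "rank": rank,
            "video_id": item["id"]["videoId"],
            "title": item["snippet"]["title"],
            "channel": item["snippet"]["channelTitle"],
        }
        for rank, item in enumerate(response.get("items", []), start=1)
    ]


def main():
    youtube = build("youtube", "v3", developerKey=API_KEY)
    rows = []
    for query in AUDIT_QUERIES:
        rows.extend(top_search_results(youtube, query))

    # Persist the ranked results; labeling each video as promoting,
    # debunking, or neutral (the misinformation-scoring step) would
    # happen downstream, outside this sketch.
    with open("search_audit_results.csv", "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(
            f, fieldnames=["query", "rank", "video_id", "title", "channel"]
        )
        writer.writeheader()
        writer.writerows(rows)


if __name__ == "__main__":
    main()
```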

Cited by 167 publications (151 citation statements); references 42 publications.
“…YouTube recently vowed to limit recommending conspiracy theories on its platform (Wong & Levin, 2019), a move that highlights the tech industry's unilateral power to shape their users' information diets. In a recent empirical audit of YouTube recommendations, Hussein, Juneja, and Mitra (2020) found that the YouTube approach indeed limited recommendations of selected conspiracy theories (e.g., the flat-earth narrative) or medical misinformation (videos promoting vaccine hesitancy), but not of other misinformation topics (e.g., the chemtrail conspiracy narrative).…”
Section: Fraud Detection
confidence: 99%
“…An analysis of 1,300 Facebook pages during the 2019 measles outbreak found anti-vaxx pages grew by 500%, compared to 50% of pro-vaccine pages (16). Individuals can also end in echo chambers; once a YouTube user develops a watch history, a filter bubble tailors their Top 5 and Up-Next recommendations, with watching videos promoting vaccine misinformation leading to more misinformed recommendations (17).…”
Section: Trust Threat and Information Sources
confidence: 99%
“…Individuals can also end in echo chambers; once a YouTube user develops a watch history, a filter bubble tailors their Top 5 and Up-Next recommendations, with watching videos promoting vaccine misinformation leading to more misinformed recommendations (17).…”
Section: Introduction
confidence: 99%
“…Over 65% of YouTube’s content about vaccines seems to be about discouraging their use—focusing on autism, adverse reactions, or mercury content.11 And search algorithms promote content similar to what users have previously watched, leading people into increasingly narrow echo chambers of disinformation.12 A recent UK study found that users who relied on social media for their information, particularly YouTube, were significantly less willing to be vaccinated.13…”
Section: Yes—Melinda Mills
confidence: 99%