2021
DOI: 10.1145/3449092

Exploring Lightweight Interventions at Posting Time to Reduce the Sharing of Misinformation on Social Media

Abstract: When users on social media share content without considering its veracity, they may unwittingly be spreading misinformation. In this work, we investigate the design of lightweight interventions that nudge users to assess the accuracy of information as they share it. Such assessment may deter users from posting misinformation in the first place, and their assessments may also provide useful guidance to friends aiming to assess those posts themselves. In support of lightweight assessment, we first deve…

Cited by 75 publications (50 citation statements)
References 41 publications

Citation statements:
“…This account posits that certain features of social networks favour the dissemination of interesting and unexpected content at the expense of accuracy 6,60. Recent research in this field has found both laboratory and field evidence that accuracy of content is often overlooked and that simple cues reminding participants to evaluate the accuracy of content reduce participants’ willingness to share fake news 35,37,61–63 (or possibly increase true news sharing 38). Increasing accuracy through incentives is not an entirely novel idea in social media either, as shown in a recent initiative promoted by Twitter 64.…”
Section: Introduction (mentioning)
confidence: 99%
“…They found that interaction with the conspiracy theory content related to global warming decreased individuals’ confidence in the scientific consensus and reduced pro-environmental decision making. Loomba et al [34] conducted a randomized controlled trial and found that exposure to misinformation about COVID-19 vaccines decreased individuals’ intent to get vaccinated. A body of research on intervention strategies to reduce the psychological effects of misinformation is also growing [16,20,27,37,38,46]. Guess et al [20] tested a digital media literacy intervention in the U.S. and India.…”
Section: Prior Research On Misinformation (mentioning)
confidence: 99%
“…This 'implicit values' perspective can be used to identify what values of online communities are most studied by the research community, and conversely, what values are most understudied (§5.1). There is a great deal of work implicitly valuing the trustworthiness and factualness of information shared within communities (i.e., the absence of mis/disinformation) [3,8,30,36,37,44,54,65,74,87,96]. Similarly, the absence of abusive, harassing, or spammy content is widely studied [9,11,13,31,60,61,66,91], as is compliance with community rules and norms [12,58,60].…”
Section: Implicit Values In Prior Research (mentioning)
confidence: 99%