2022
DOI: 10.48550/arxiv.2202.00799
Preprint
What is the Will of the People? Moderation Preferences for Misinformation

Abstract: To reduce the spread of misinformation, social media platforms may take enforcement actions against offending content, such as adding informational warning labels, reducing distribution, or removing content entirely. However, both their actions and their inactions have been controversial and plagued by allegations of partisan bias. The controversy in part can be explained by a lack of clarity around what actions should be taken, as they may not neatly reduce to questions of factual accuracy. When decisions are…

Cited by 3 publications (1 citation statement)
References 25 publications
“…In fact, some users perceive platform labels as judgmental, paternalistic, and against the platform ethos [89]. Another concern is that any centralized decision, by platforms or otherwise, to for instance, down-rank or filter misinformation cannot address the needs of every user, as some users want to be aware of what misinforming content their social circle is exposed to, so that they can talk to them about it [14,51,69,86]. Yet another concern is that for these centralized approaches to work, users have to rely on the trustworthiness, goodwill, and competence of the moderators [66,71]; however, some users consider platforms profit-driven and politically biased [51,89].…”
Section: Centralization
Citation type: mentioning
confidence: 99%