In this paper, we argue that WhatsApp can play an important role in correcting misinformation. We show how specific WhatsApp affordances (flexibility in format and audience selection) and existing social capital (the prevalence of strong ties; homophily in political groups) can be leveraged to maximize the re-sharing of debunking messages, such as those accessed by WhatsApp users via chatbots and tip lines. Debunking messages received as audio files generated more interest and were more effective in correcting beliefs than text- or image-based messages. In addition, we found clear evidence that users re-share debunks at higher rates when they receive them from people close to them (strong ties), from individuals who generally agree with them politically (in-group members), or when both conditions are met. We suggest that WhatsApp leverage our findings to maximize the re-sharing of fact-checks already circulating on the platform by drawing on the existing social capital in the network, unlocking the potential for such debunks to reach a larger audience on WhatsApp.
Moderating content on social media can lead to severe psychological distress. However, little is known about the type, severity, and consequences of distress experienced by volunteer content moderators (VCMs), who take on this work unpaid and by choice. We present results from a survey that investigated why Facebook Group and subreddit VCMs quit, and whether their reasons for quitting are correlated with psychological distress, demographics, and/or community characteristics. We found that VCMs are likely to experience psychological distress stemming from struggles with other moderators, from moderation team leads' harmful behaviors, and from having too little available time, and that these experiences of distress relate to their reasons for quitting. While substantial research has focused on making the task of detecting and assessing toxic content easier or less distressing for moderation workers, our study shows that social interventions for VCM workers, for example to support them in navigating interpersonal conflict with other moderators, may be necessary.
With the increasing dominance of the internet as a source of news consumption, there has been a rise in the production and popularity of email newsletters compiled by individual journalists. However, there is little research on the processes of aggregation and how these processes differ between expert journalists and trained machines. In this paper, we interviewed journalists from around the world who curate newsletters. Through an in-depth understanding of these journalists' workflows, our findings lay out the role of their prior experience in the value they bring to the curation process, their own use of algorithms to find stories for their newsletters, and their internalization of their readers' interests and the context they are curating for. While identifying the role of human expertise, we highlight the importance of hybrid curation and provide design insights into how technology can support the work of these experts.