Extended Data Fig. 2 | Distribution of sharing intentions in studies 3 and 4, by condition and headline veracity. Whereas Fig. 2 discretizes the sharing intention variable for ease of interpretation such that all 'unlikely' responses are scored as 0 and all 'likely' responses are scored as 1, here the full distributions are shown. The regression models use these non-discretized values.

Extended Data Fig. 3 | Distribution of sharing intentions in study 5, by condition and headline veracity. Whereas Fig. 2 discretizes the sharing intention variable for ease of interpretation such that all 'unlikely' responses are scored as 0 and all 'likely' responses are scored as 1, here the full distributions are shown. The regression models use these non-discretized values.
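For concreteness, a minimal sketch of the discretization described in the captions above, assuming a 6-point 'unlikely'-to-'likely' response scale split at its midpoint (the scale length, midpoint, and names here are illustrative assumptions, not the authors' code):

```python
# Hedged sketch: binarize a Likert-style sharing-intention variable for plotting,
# as described in the Extended Data figure captions. Assumes a 6-point scale in
# which responses 1-3 count as 'unlikely' (scored 0) and 4-6 as 'likely' (scored 1);
# the regressions would use the raw, non-discretized values instead.
import pandas as pd

def discretize_sharing(responses: pd.Series, midpoint: float = 3.5) -> pd.Series:
    """Score responses above the scale midpoint as 1 ('likely'), others as 0."""
    return (responses > midpoint).astype(int)

# Example with hypothetical data:
raw = pd.Series([1, 2, 5, 6, 3, 4])
print(discretize_sharing(raw).tolist())  # [0, 0, 1, 1, 0, 1]
```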
Online labor markets provide new opportunities for behavioral research, but conducting economic experiments online raises important methodological challenges. This particularly holds for interactive designs. In this paper, we provide a methodological discussion of the similarities and differences between interactive experiments conducted in the laboratory and online. To this end, we conduct a repeated public goods experiment with and without punishment using samples from the laboratory and the online platform Amazon Mechanical Turk. We chose to replicate this experiment because it is long and logistically complex. It therefore provides a good case study for discussing the methodological and practical challenges of online interactive experimentation. We find that basic behavioral patterns of cooperation and punishment in the laboratory are replicable online. The most
The spread of false and misleading news content on social media is of great societal concern. Why do people share such content, and what can be done about it? In a first survey experiment (N=1,015), we demonstrate a disconnect between accuracy judgments and sharing intentions: even though true headlines are rated as much more accurate than false headlines, headline veracity has little impact on sharing. Although this may seem to indicate that people share inaccurate content because they care more about furthering their political agenda than they care about truth, we propose an alternative attentional account: Most people do not want to spread misinformation, but the social media context focuses their attention on factors other than truth and accuracy. Indeed, when directly asked, most participants say it is important to only share news that is accurate. Accordingly, across four survey experiments (total N=3,485) and a digital field experiment on Twitter in which we messaged users who had previously shared news from websites known for publishing misleading content (N=5,379), we find that inducing people to think about the concept of accuracy increases the quality of the news they subsequently share. Together, these results challenge the narrative that people no longer care about accuracy. Instead, the results support our inattention-based account wherein people fail to implement their preference for accuracy due to attentional constraints. Furthermore, our research provides evidence for scalable anti-misinformation interventions that are easily implementable by social media platforms.