by Mason Ross Hayes
scite is on a mission to make science more reliable, and this mission goes far beyond building tools for researchers, publishers, or institutions. Science is part of our everyday lives: from news articles to Twitter posts, many of us encounter scientific claims every day, whether as interesting facts, anecdotal evidence, or misinformation. But the time and effort needed to evaluate every scientific claim we come across is simply too much. When a study is cited, even non-experts need a quick, reliable way to judge whether its findings hold up.
In this post, I will show how scite can help you easily evaluate scientific claims that you come across online.
When they first start learning about a new subject, scientific or not, millions of people turn to YouTube. For a first encounter with a new topic, it is hard to beat the speed and clarity of a video, which is why many publishers are starting to adopt video explanations and graphical abstracts. Videos provide an entertaining, easily digestible introduction to even the most complex ideas. Perhaps this is why educational videos are among the most popular on YouTube, with more than one-third of US adults reporting that YouTube is important for learning how to do something new.
But beginning to learn something new, especially via YouTube, comes with many questions, perhaps most prominently: how can I trust this? When many of those videos contradict one another, conflict with the established scientific literature, or just seem strange, how can we be sure that what they say is reliable?
If you’re not sure what I mean, let me provide an example from my own experience.
Like many others staying inside during this pandemic, I have started working out at home. And as I’ve begun, I have had many questions: what are the best workouts to do from home? How frequently should I exercise? What type of food should I eat? And, like many others, I’ve used YouTube to find answers to these questions.
For every question I ask, though, it seems there is a video with a different answer. When there are so many videos that contradict one another, how do we choose which information is most credible? How do we sort the helpful information from the misleading or the exaggerated?
Take, for example, this video from Jeff Nippard on workout volume (the amount of weight multiplied by the number of repetitions), in which he cites the study “Evidence of a Ceiling Effect for Training Volume in Muscle Hypertrophy and Strength in Trained Men — Less is More?” by Barbalho et al. 2019.
When citing the study, he notes that “the results look a bit confusing at first” and that they “contradict the prevailing volume model”. These results, as well as his comments on them, instantly got me curious! With such confusing results, we should expect strong evidence to support them.
As it turns out, the results were confusing for a reason, and the evidence behind them was not sufficient. On 16 April 2020, about five months after Nippard’s video was released, the study by Barbalho et al. was retracted. Yet the video citing the article has received nearly 1.3 million views to date, and continues to receive thousands of new views each week.
Most YouTube users view educational videos precisely because they are quick and convenient. So of course, very few people bother to spend the time and energy finding and evaluating the references in an educational video (how many of us even read the video description anyway?). How could we get more people interested in checking the reliability of references behind the information they find online?
Well, we can make finding and evaluating those references as quick and convenient as watching a video! This is one thing that scite excels at: if you want to see how others are discussing an article or if you want to check that a reference is reliable, scite makes even the most complex topics easily accessible to experts and novices alike.
A look at the scite report for an article lets anyone, expert or not, quickly see how the article has been received by the research community.
Anyone who clicks the link to the original article will also see that it has been retracted, along with the retraction notice. However, the retraction notice in this case provides very little information, saying only that the authors did “an a posteriori analysis of the data and identified inconsistencies that changed their evaluation of the results.” How do we find out what those inconsistencies were? The notice doesn’t say. Now let’s take a closer look at the scite report for Barbalho et al. (2019), which gives us more information.
The scite report page shows the context of each citation to an article and classifies those citations as supporting, disputing, or mentioning. This combination of citation context and citation classification is what we call a Smart Citation. (If you want to learn more about the details of how citations are extracted and classified, click here).
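To make the idea concrete, here is a minimal sketch of what a Smart Citation might look like as a data structure, and how per-class tallies like those at the top of a scite report could be computed. The class name, fields, and example data are hypothetical illustrations, not scite’s actual schema or API.

```python
from collections import Counter
from dataclasses import dataclass

# Hypothetical shape of a Smart Citation: the citing statement (context)
# plus a classification label. Illustrative only; not scite's real schema.
@dataclass
class SmartCitation:
    citing_doi: str       # DOI of the article making the citation
    context: str          # the sentence in which the citation appears
    classification: str   # "supporting", "disputing", or "mentioning"

def tally(citations):
    """Summarize Smart Citations into per-class counts, similar in
    spirit to the tallies shown on a scite report page."""
    return Counter(c.classification for c in citations)

# Invented example data for illustration.
citations = [
    SmartCitation("10.1000/a", "Our results support a ceiling effect for volume.", "supporting"),
    SmartCitation("10.1000/b", "In contrast to this study, we found no ceiling.", "disputing"),
    SmartCitation("10.1000/c", "Training volume has been studied extensively.", "mentioning"),
]

print(dict(tally(citations)))
```

The point of the structure is that each citation carries its surrounding sentence, so a reader can see *how* a paper was cited, not just that it was cited.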
Smart Citations allow you to see exactly how other articles are citing a particular article, giving you more information than a citation count or a retraction notice can provide — and in a shorter, more digestible format. With them, in addition to seeing that the Barbalho et al. article has been retracted, we can easily gain insight into why it was retracted, and see how other researchers are citing the article.
From the scite report, we see that the “inconsistencies” mentioned in the retraction notice came from an undergraduate student “transferring the data from paper to Excel”. Now, it is much more clear why the article was retracted.
But this citation came from a YouTube video, not a scientific article, and since the video was released before the retraction, we might think it isn’t that big of a deal. The problem extends far beyond YouTube and social media, though: citations to retracted studies appear often even in scientific journals. Two separate articles have cited the Barbalho et al. study since its retraction on 16 April 2020.
scite provides tools for experts and novices alike to quickly see if a study is reliable. In addition to flagging retracted articles, scite’s Smart Citations allow anyone to look behind the curtain and understand the context of each citation to a paper, revealing important information that is usually hidden from view. And scite does this quickly and automatically: instead of reading all the papers that cite the paper you’re interested in (this could be dozens of papers), you can get a thorough overview of the paper without leaving the scite report page, in a fraction of the time.
While we cannot prevent all errors in science, we are working to make them transparent and, more importantly, to keep them from spreading. With Reference Check now being integrated into many different peer review systems, we hope to help authors ensure that they cite reliable references and avoid unintentionally citing retracted or highly disputed studies. To bring more awareness to this issue, the scite ref check bot works on Twitter every day, finding and tweeting papers that cite retracted articles post-retraction.
Want to see how scite can help you? The scite Zotero plug-in and browser extension are available, free and open source, for anyone who wants to harness the power of Smart Citations.