Whereas experts in other countries diagnose a “crisis” of science reporting, Germany has seen an unprecedented boom in science journalism. But is this boom limited to the science sections themselves, or does it also spread to other sections? And how does this increase in coverage influence the character of the reporting itself? An analysis of 4,077 articles in three nationwide newspapers across two periods (2003–2004 and 2006–2007) indicates an overall increase in science reporting of 48%; outside the newspapers' science sections, the increase is 136%.
The Covid-19 pandemic has had immediate effects on science journalism and on science communication in general, some of which are atypical and likely to disappear again after the crisis. However, from a German perspective, there is some evidence that the crisis, and the ‘infodemic’ accompanying it, has above all accelerated and made more visible existing developments and deficits, and has increased the need for funding of science journalism.
While the quality of environmental science journalism has been the subject of much debate, a widely accepted benchmark for assessing the quality of coverage of environmental topics has so far been lacking. Therefore, we developed a set of defined quality criteria for environmental reporting. This instrument and its applicability are tested in a newly established monitoring project that assesses pieces on environmental issues which refer to scientific sources and can therefore be regarded as a special field of science journalism. Quality is assessed in a form of journalistic peer review. We describe the systematic development of the criteria, which might also serve as a model procedure for other fields of science reporting. Furthermore, we present results from the monitoring of 50 environmental reports in German media. According to these preliminary data, a lack of context and insufficient explanation of the underlying evidence are the major problems in environmental reporting.
The quality and authenticity of images are essential for data presentation, especially in the life sciences. Questionable images may often be a first indicator of questionable results, too. Therefore, a tool that uses mathematical methods to detect suspicious images in large image archives can be a helpful instrument for improving quality assurance in publications. As a first step towards a systematic screening tool, especially for journal editors and other staff members responsible for quality assurance, such as laboratory supervisors, we propose a basic classification of image manipulations. Based on this classification, we developed and explored simple algorithms to detect copied areas in images. Using an artificial image and two examples of previously published manipulated images, we apply quantitative methods such as pixel-wise comparison, a nearest-neighbor algorithm, and a variance algorithm to detect copied-and-pasted areas or duplicated images. We show that our algorithms are able to detect some simple types of image alteration, such as copying and pasting background areas. The variance algorithm detects not only identical but also very similar areas that differ only in brightness. Further types could, in principle, be implemented in a standardized scanning routine. We detected the copied areas in a proven case of image manipulation in Germany and showed the similarity of two images in a retracted paper from the Kato labs, which has been widely discussed on sites such as PubPeer and Retraction Watch.
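To illustrate the general idea of block-wise duplicate detection described above, the following is a minimal sketch, not the authors' actual implementation. It assumes a grayscale image given as a NumPy array and searches for pairs of non-overlapping blocks that are (near-)identical after mean subtraction, loosely analogous to a variance-based criterion that tolerates brightness shifts. The function and parameter names (find_duplicate_blocks, block, stride, tol) are illustrative assumptions.

```python
# Hypothetical sketch of block-wise duplicate detection in a grayscale image.
# Not the published algorithm; names and thresholds are illustrative only.
import numpy as np

def find_duplicate_blocks(img, block=16, stride=8, tol=1e-3):
    """Return pairs of top-left coordinates of (near-)identical blocks.

    Blocks are mean-subtracted before comparison, so copies that differ
    only in brightness are still matched.
    """
    img = img.astype(np.float64)
    h, w = img.shape
    coords, patches = [], []
    for y in range(0, h - block + 1, stride):
        for x in range(0, w - block + 1, stride):
            p = img[y:y + block, x:x + block]
            patches.append((p - p.mean()).ravel())  # brightness-invariant
            coords.append((y, x))
    patches = np.asarray(patches)

    matches = []
    for i in range(len(patches)):
        for j in range(i + 1, len(patches)):
            yi, xi = coords[i]
            yj, xj = coords[j]
            # Skip overlapping blocks, which are trivially similar.
            if abs(yi - yj) < block and abs(xi - xj) < block:
                continue
            mse = np.mean((patches[i] - patches[j]) ** 2)
            # Flag pairs whose difference is tiny relative to their variance.
            if mse < tol * (patches[i].var() + patches[j].var() + 1e-12):
                matches.append((coords[i], coords[j]))
    return matches

if __name__ == "__main__":
    # Toy example: copy a textured patch to another location, shifted in brightness.
    rng = np.random.default_rng(0)
    img = rng.normal(100, 10, size=(128, 128))
    img[80:96, 80:96] = img[16:32, 16:32] + 5.0
    print(find_duplicate_blocks(img, block=16, stride=16))
```

The brute-force pairwise comparison is quadratic in the number of blocks; a screening tool for large image archives would in practice need a faster indexing scheme (for example, hashing or nearest-neighbor search on block descriptors), but the sketch shows the basic principle of detecting copied areas that differ only by brightness.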
scite is a Brooklyn-based organization that helps researchers better discover and understand research articles through Smart Citations: citations that display the context of the citation and indicate whether the citing article provides supporting or contrasting evidence. scite is used by students and researchers around the world and is funded in part by the National Science Foundation and the National Institute on Drug Abuse of the National Institutes of Health.