For many years, the journal evaluation system has centered on impact indicators, so evaluation results do not reflect the academic innovation of journals. To address this issue, this study attempts to construct a Journal Disruption Index (JDI) from the perspective of measuring the disruption of each journal article. Specifically, we first measured the disruption of the articles of 22 selected virology journals based on the OpenCitations Index of Crossref open DOI-to-DOI citations (COCI). We then calculated the JDI of the 22 virology journals from the absolute disruption index ($D_Z$) of their articles. Finally, we conducted an empirical study of the differences and correlations between impact indicators and disruption indicators, as well as of the evaluation effect of the disruption index. The results show that: (1) Journal rankings based on disruption indicators differ considerably from those based on impact indicators. Among the 22 journals, 12 are ranked higher by JDI than by the Cumulative Impact Factor for 5 years (CIF5), the Journal Index for PR6 (JIPR6) and the average Percentile in Subject Area (aPSA), and for 17 journals the ranking difference between the two kinds of indicators is at least 5. (2) Disruption indicators and impact indicators are moderately correlated at both the journal and article levels. JDI is moderately correlated with CIF5, JIPR6 and aPSA, with correlation coefficients of 0.486, 0.471 and −0.448, respectively.
$D_Z$ was also moderately correlated with Cumulative Citation (CC), Percentile Ranking with 6 Classifications (PR6) and Percentile in Subject Area (PSA), with correlation coefficients of 0.593, 0.575 and −0.593, respectively. (3) Compared with traditional impact indicators, the results of journal disruption evaluation are more consistent with the results of expert peer review. JDI thus reflects the innovation level of journals to a certain extent, which can help promote innovation-oriented evaluation of sci-tech journals.
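As context for the measurement pipeline sketched above: the classic article-level disruption index of Wu, Wang and Evans (2019), on which absolute variants such as $D_Z$ build, is defined from the citation network around a focal article as

$$D = \frac{n_i - n_j}{n_i + n_j + n_k},$$

where $n_i$ is the number of papers citing the focal article but none of its references, $n_j$ the number citing both the focal article and at least one of its references, and $n_k$ the number citing at least one of the references but not the focal article. As a minimal sketch of the journal-level aggregation, assuming (as one plausible reading, not this study's stated formula) that JDI simply averages the article-level values over the $m$ articles a journal published in the evaluation window,

$$\mathrm{JDI} = \frac{1}{m} \sum_{t=1}^{m} D_Z^{(t)}.$$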