This study investigated the usability of dedicated COVID-19 state public health websites in the US and whether case counts in different geographical regions were related to website usability. Sixteen state websites, representing the 2 highest and 2 lowest case-count states in each region, were selected. Five experts used a heuristic evaluation procedure to independently rate all 16 websites on a severity scale of 0–4. Usability criteria published by the US Department of Health and Human Services and criteria on risk communication and data dashboards were used. Analyses involved cross-tabulation of usability criteria with case counts, comparison of usability scores using Mood's median tests, tests of differences in average usability scores using ANOVA and post-hoc tests, and identification of correlations between case counts and usability scores. Results from the Mood's median test showed that the median usability scores for the states were significantly different from each other at the 5% level of significance (df = 15, chi-square = 38.40, p = 0.001). ANOVA showed statistically significant differences between the mean usability scores for the states at the 5% level of significance (F = 6.33, p < 0.05). Although not statistically significant, a correlation analysis between case count and usability scores showed a negative correlation (r = -0.209, p = 0.437), indicating that the higher the case count, the better the usability score. Overall, the websites fared well on usability, but many served primarily as information and data repositories. These websites must communicate infection risk better.
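A minimal sketch of the kind of analyses the abstract describes (Mood's median test, one-way ANOVA, and a Pearson correlation) is shown below, using SciPy and synthetic placeholder data; the group sizes, rating values, and case counts are assumptions and do not reproduce the study's data.

```python
# Sketch of the reported analyses on hypothetical data (not the study's data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Placeholder severity ratings (0-4) from 5 evaluators for each of 16 state websites.
ratings_by_state = [rng.integers(0, 5, size=5) for _ in range(16)]

# Mood's median test comparing the median usability (severity) scores across states.
stat, p_med, grand_median, table = stats.median_test(*ratings_by_state)
print(f"Mood's median test: chi-square={stat:.2f}, p={p_med:.3f}")

# One-way ANOVA comparing mean usability scores across the same 16 groups.
f_stat, p_anova = stats.f_oneway(*ratings_by_state)
print(f"ANOVA: F={f_stat:.2f}, p={p_anova:.3f}")

# Pearson correlation between state case counts and mean usability scores.
case_counts = rng.integers(10_000, 500_000, size=16)  # placeholder case counts
mean_scores = np.array([r.mean() for r in ratings_by_state])
r, p_corr = stats.pearsonr(case_counts, mean_scores)
print(f"Correlation: r={r:.3f}, p={p_corr:.3f}")
```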
Relevance to industry
The study applies to public health agency websites that communicate essential information during a pandemic.
Healthcare provider workflows for documenting and tracking patients, hitherto predominantly manual and paper-based, have recently become significantly computerized. Computerization has made the storage, processing and retrieval of information easier. However, it has increased the potential for errors, impacted direct patient care, and burdened providers with documentation. This paper describes a study to understand information use among providers in intensive care units, and inpatient and outpatient units. Findings on the extent of provider time spent on computer use and impact on direct patient care are described.