Objectives: The aim of this study is to provide an overview of the quality of health economic evaluations (HEEs) of prediction models, the evidence used, and the challenges involved. Methods: The databases Medline, Embase, EconLit, and the NHS Economic Evaluation Database were systematically searched for HEEs of diagnostic and prognostic risk prediction models. The methodological quality of the included HEEs was evaluated using the Drummond checklist. Furthermore, an item list was developed incorporating descriptive items on the HEE, items specific to HEEs of prediction models, and statistical characteristics of the prediction model that could be incorporated into the evaluation. Results: The database search yielded 791 unique papers, of which 653 were excluded based on the abstract. After assessing the full texts, 17 HEEs (all cost-utility studies) were included. A prediction model was compared to current practice in 11 HEEs and to an extended prediction model in 6 HEEs. On a 35-point scale, the quality score ranged from 17 to 32 (median 25). In 7 papers there was no overlap between the authors of the initial prediction model paper and those of the corresponding HEE. In 5 papers individuals were classified using a single threshold or set of thresholds, based on guidelines in 4 papers and on expert opinion in 1. In 8 papers the classification threshold was optimized in the cost-effectiveness analysis (CEA) itself. A probabilistic sensitivity analysis was not included in 7 papers, and uncertainty around the predicted risks was taken into account in only 1 paper. Conclusions: In most papers, limited (prediction model) details were available. Potentially owing to this lack of evidence and the absence of specific guidelines for HEEs of prediction models, large variation in quality and methodology was observed. This variation may complicate the validation and interpretation of HEE results and thereby the decision making on the implementation of prediction models in practice.

Objectives: To understand the key functional differences between conventional Excel-based and web-based cost-effectiveness model types. Methods: An online survey of 18 end users and 5 model owners (n = 23) was conducted. Respondents were asked to rate key criteria of both model types on a scale from 0 to 10. The model types were compared on the following 13 criteria: model execution speed and size, general functionality support, accessibility, usability, model management and versioning, ease of localization, ease of model core modification, sharing, review process, usage analytics, and integration with other content. No weighting was applied to the scoring across criteria. Results: The survey results indicate that web-based models outperform standalone models in 10 of the 13 criteria assessed. The model review process, ease of model core modification, and execution speed were rated higher for conventional standalone Excel models. 80% of model owners and 78% of model users assigned a higher overall score to web-based models than to Excel models. Conclusions: Web-based models offer advantages primarily related to model...