Aims
The appropriate use of intravenous (IV) iron is essential to minimise the requirement for erythropoiesis-stimulating agents (ESAs). The clinical efficacy of generic IV iron compared with the original formulation is controversial. We evaluated the changes induced after switching from a generic IV iron to the original formulation in a stable, prevalent haemodialysis (HD) population.
Methods
A total of 342 patients were included, and the follow-up period was 56 weeks for each formulation. Anaemia parameters and doses of ESA and IV iron were prospectively recorded before and after the switch from generic to original IV iron.
Results
To maintain the same haemoglobin (Hb) levels after switching from the generic to the original formulation, the IV iron dose requirements were reduced by 34.3% (from 52.8±33.9 to 34.7±31.8 mg/week, p<0.001), and the ESA doses also decreased by 12.5% (from 30.6±23.6 to 27±21 μg/week, p<0.001). The erythropoietin resistance index declined from 8.4±7.7 to 7.4±6.7 IU/kg/week/g/dl after the switch from the generic to the original drug (p = 0.001). After the switch, the transferrin saturation ratio (TSAT) and serum ferritin levels rose by 6.8% (p<0.001) and 12.4% (p = 0.001), respectively. The mortality rate was similar for both periods.
Conclusions
Iron and ESA requirements are lower with the original IV iron than with the generic drug. In addition, the use of the original formulation results in higher ferritin and TSAT levels despite the lower IV iron dose. Further studies are necessary to analyse the adverse effects of higher IV iron dosages.
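For reference, the erythropoietin resistance index (ERI) quoted above is conventionally computed as the weekly ESA dose per kilogram of body weight divided by the haemoglobin level (when darbepoetin is dosed in μg/week, it is typically converted to epoetin-equivalent IU first). A minimal sketch with an entirely hypothetical patient, not data from the study:

```python
def eri(weekly_esa_iu, weight_kg, hb_g_dl):
    """Erythropoietin resistance index in IU/kg/week per g/dl."""
    return weekly_esa_iu / weight_kg / hb_g_dl

# Hypothetical patient: 6160 IU/week epoetin-equivalent ESA, 70 kg, Hb 11 g/dl
print(round(eri(6160, 70, 11.0), 1))  # → 8.0
```

A lower ERI after the switch, as reported here, means the same haemoglobin level is maintained with less ESA per kilogram of body weight.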
Background
Besides the classic logistic regression analysis, non-parametric methods based on machine learning techniques such as random forest are presently used to generate predictive models. The aim of this study was to evaluate random forest mortality prediction models in haemodialysis patients.
Methods
Data were acquired from incident haemodialysis patients between 1995 and 2015. Mortality at 6 months, 1 year and 2 years of haemodialysis was predicted using random forest, and the accuracy was compared with that of logistic regression. Baseline data were constructed from the information obtained during the initial period of regular haemodialysis. To increase the accuracy of each patient's baseline information, the data collection window was set at 30, 60 and 90 days after the first haemodialysis session.
Results
There were 1571 incident haemodialysis patients included. The mean age was 62.3 years and the average Charlson comorbidity index was 5.99. The mortality prediction models obtained by random forest appear to be adequate in terms of accuracy [area under the curve (AUC) 0.68–0.73] and superior to the logistic regression models (ΔAUC 0.007–0.046). The results also indicate that random forest and logistic regression build their prediction models from different sets of variables.
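The AUC values above summarise how well each model separates patients who died from those who survived. The AUC can be computed nonparametrically from predicted risks as the fraction of (death, survival) patient pairs in which the model assigns the higher risk to the patient who died (the Mann–Whitney formulation). A minimal sketch with made-up risk scores, not data from the study:

```python
def auc(scores, labels):
    """Probability that a randomly chosen positive case outranks
    a randomly chosen negative one (ties count as 1/2)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Illustrative predicted mortality risks and observed outcomes (1 = died)
risks = [0.9, 0.7, 0.6, 0.4, 0.3, 0.1]
died  = [1,   1,   0,   1,   0,   0]
print(auc(risks, died))  # → 0.888... (8 of 9 pairs ranked correctly)
```

A ΔAUC of 0.007–0.046, as reported, means the random forest ranks a slightly larger share of such pairs correctly than logistic regression does.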
Conclusions
Random forest is an adequate method, and superior to logistic regression, to generate mortality prediction models in haemodialysis patients.