Hemoglobin levels vary substantially over time in hemodialysis patients, and this variability may portend poor outcomes. For a given patient, hemoglobin concentration over time can be described by absolute levels, rate of change, or by the difference between observed level and expected level based on the preceding trend (i.e., seemingly random variability). We investigated the independent associations of these different methods of describing hemoglobin over time with mortality in a retrospective cohort of 34,963 hemodialysis patients. Hemoglobin concentration over time was modeled with linear regression for each subject, and the model was then used to define the subject's absolute level of hemoglobin (intercept), temporal trend in hemoglobin (slope), and hemoglobin variability (residual standard deviation). Survival analyses indicated that each 1 g/dl increase in the residual standard deviation was associated with a 33% increase in rate of death, even after adjusting for multiple covariates. Patient characteristics accounted for very little of the variation in our hemoglobin variability metric (R² = 0.019). We conclude that greater hemoglobin variability is independently associated with higher mortality.
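As a sketch of the per-subject modeling step described above (the notation is ours, not taken from the study), the regression and the derived variability metric can be written as

\[
\mathrm{Hb}_{ij} = \alpha_i + \beta_i\, t_{ij} + \varepsilon_{ij},
\qquad
\hat{\sigma}_i = \sqrt{\frac{1}{n_i - 2} \sum_{j=1}^{n_i} \hat{\varepsilon}_{ij}^{\,2}},
\]

where \(\mathrm{Hb}_{ij}\) is the \(j\)-th of \(n_i\) hemoglobin measurements for patient \(i\) at time \(t_{ij}\). Under this formulation, the fitted intercept \(\hat{\alpha}_i\) corresponds to the patient's absolute level, the fitted slope \(\hat{\beta}_i\) to the temporal trend, and the residual standard deviation \(\hat{\sigma}_i\) to the hemoglobin variability measure carried into the survival analyses.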