It has now been over 60 years since U.S. nuclear testing was conducted in the Pacific islands and Nevada, exposing military personnel to varying levels of ionizing radiation. Actual doses are not well established, as film badges in the 1950s had many limitations. We sought a means of independently assessing dose for comparison with historical film-badge records and with dose reconstruction conducted in parallel. For the purpose of quantitative retrospective biodosimetry, peripheral blood samples from 12 exposed veterans and 12 age-matched (>80 years) veteran controls were collected and evaluated for radiation-induced chromosome damage using directional genomic hybridization (dGH), a cytogenomics-based methodology that enables simultaneous detection of translocations and inversions. Standard calibration curves were constructed from six male volunteers in their mid-20s to reflect the age range of the veterans at the time of exposure. Doses were estimated for each veteran using translocation and inversion rates independently; combining the two estimates by a weighted average, however, generally improved the accuracy of dose estimation. Potential confounding factors were also evaluated for effects on chromosome aberration frequencies. Perhaps not surprisingly, smoking- and age-associated increases in background frequencies of inversions were observed. Telomere length was also measured and showed inverse relationships with both age and the combined weighted dose estimates. Interestingly, smokers in the non-exposed control veteran cohort displayed telomere lengths similar to those of never-smokers in the exposed veteran group, suggesting that chronic smoking had as much effect on telomere length as a single exposure to radioactive fallout. Taken together, we find that our combined chromosome aberration-based approach to retrospective biodosimetry provided reliable dose estimates, particularly on a group-average basis, for exposures above statistical detection limits.
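
The weighted-average combination of the two cytogenetic endpoints can be sketched as follows. The abstract does not specify the weighting scheme, so inverse-variance weighting is assumed here purely for illustration; the function name and all numerical values are hypothetical and not taken from the study.

```python
# Hypothetical sketch: combining translocation- and inversion-based dose
# estimates (in Gy) by a weighted average. Inverse-variance weighting is
# an assumption, not necessarily the scheme used in the study.

def weighted_dose(dose_t, var_t, dose_i, var_i):
    """Return (combined dose, combined variance) from two independent
    estimates, each weighted by the reciprocal of its variance."""
    w_t = 1.0 / var_t
    w_i = 1.0 / var_i
    combined = (w_t * dose_t + w_i * dose_i) / (w_t + w_i)
    combined_var = 1.0 / (w_t + w_i)
    return combined, combined_var

# Illustrative values only: a translocation-based estimate of 0.30 Gy
# (variance 0.01) and an inversion-based estimate of 0.40 Gy (variance
# 0.04) combine to an estimate closer to the more precise endpoint.
d, v = weighted_dose(0.30, 0.01, 0.40, 0.04)
print(d, v)  # → 0.32 0.008
```

The combined estimate inherits a smaller variance than either input, which is one reason a weighted combination of two independent endpoints can improve dose-estimation accuracy over either endpoint alone.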