To determine whether measurements of endothelium/Descemet membrane complex thickness (En/DMT) are predictive of corneal graft rejection after high-risk corneal transplantation, we conducted a prospective, single-center, observational case series of 60 eyes (60 patients) at high risk for graft rejection (GR) because of previous immunologic graft failure or at least two quadrants of stromal vascularization. All patients underwent corneal transplantation. At the 1st, 3rd, 6th, 9th, and 12th postoperative months, high-definition optical coherence tomography (HD-OCT) imaging of the cornea was performed, and graft status was determined clinically at each visit by a masked cornea specialist. A custom-built tomography segmentation algorithm was used to measure central En/DMT. Relationships between baseline factors and En/DMT were explored. Cox survival regression with a time-dependent covariate was used to assess the effect of postoperative En/DMT changes during follow-up, and a longitudinal repeated-measures model was used to assess the relationship between En/DMT and graft status. Outcome measures included graft rejection, central En/DMT, and central corneal thickness (CCT). In patients who developed GR (35%), central En/DMT increased significantly 5.3 months (95% CI: 2, 11) before the clinical diagnosis of GR, whereas it remained stable in patients without GR. During the 1-year follow-up, rejected grafts had a higher mean pre-rejection En/DMT than non-rejected grafts (p = 0.01), whereas CCT did not differ significantly (p = 0.7). With an En/DMT cut-off of ≥ 18 µm at any pre-rejection visit, the Cox proportional hazard ratio was 6.89 (95% CI: 2.03, 23.4; p = 0.002), and it increased to 9.91 (95% CI: 3.32, 29.6; p < 0.001) with a ≥ 19 µm cut-off. In high-risk corneal transplants, an increase in En/DMT predicted graft rejection before it became clinically apparent.
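The abstract does not describe the custom segmentation algorithm beyond its purpose, so the following is only a minimal sketch of the final measurement step it implies: converting segmented En/DM boundary positions into a central thickness in micrometers. The function name, array layout, and the assumed axial pixel size are all hypothetical.

```python
# Minimal sketch, not the study's actual pipeline: converts segmented
# boundary positions (pixels) into a central En/DMT value in micrometers.
import numpy as np

AXIAL_PIXEL_UM = 3.0  # assumed axial pixel size of the HD-OCT scan (µm/pixel)

def central_en_dmt(anterior_px: np.ndarray, posterior_px: np.ndarray) -> float:
    """Mean central En/DMT (µm) from per-A-scan boundary positions.

    anterior_px:  depth of the stroma / En/DM complex interface per A-scan
    posterior_px: depth of the posterior corneal surface per A-scan
    """
    thickness_um = (posterior_px - anterior_px) * AXIAL_PIXEL_UM
    return float(np.mean(thickness_um))
```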
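The time-dependent covariate Cox analysis could be set up along the following lines; this is a sketch assuming long-format visit data and the Python lifelines library, neither of which is stated in the abstract. All column names (patient_id, start, stop, rejection, en_dmt) are hypothetical.

```python
# Sketch only: data layout, column names, and library choice are assumptions.
import pandas as pd
from lifelines import CoxTimeVaryingFitter

# Hypothetical long-format table: one row per patient per follow-up interval,
# with 'start'/'stop' in months since transplant, 'rejection' = 1 if GR was
# diagnosed at the end of the interval, and 'en_dmt' the central En/DMT (µm)
# measured at the visit that opens the interval.
visits = pd.read_csv("endmt_visits.csv")  # hypothetical file
visits = visits.sort_values(["patient_id", "start"])

# "≥ 18 µm at any pre-rejection visit": once the running maximum reaches the
# cut-off, the covariate stays 1 for all later intervals of that patient.
running_max = visits.groupby("patient_id")["en_dmt"].cummax()
visits["en_dmt_ge_18"] = (running_max >= 18).astype(int)

ctv = CoxTimeVaryingFitter()
ctv.fit(
    visits[["patient_id", "start", "stop", "rejection", "en_dmt_ge_18"]],
    id_col="patient_id",
    event_col="rejection",
    start_col="start",
    stop_col="stop",
)
ctv.print_summary()  # hazard ratio for en_dmt_ge_18 = exp(coef)
```

Refitting with the threshold raised to 19 µm would yield the hazard ratio for the higher cut-off reported above.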
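Likewise, the longitudinal repeated-measures analysis could be approximated with a linear mixed-effects model using a random intercept per patient; the study's actual model specification is not given, so the variable names and structure below are assumptions.

```python
# Sketch only: the abstract does not specify the repeated-measures model.
import pandas as pd
import statsmodels.formula.api as smf

visits = pd.read_csv("endmt_visits.csv")  # hypothetical long-format file

# Fixed effects: visit month, eventual graft status (rejected = 0/1), and
# their interaction (tests for diverging En/DMT trajectories); a random
# intercept per patient accounts for repeated measurements within eyes.
model = smf.mixedlm("en_dmt ~ month * rejected", data=visits,
                    groups=visits["patient_id"])
print(model.fit().summary())
```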