Early T-cell precursor (ETP) acute lymphoblastic leukemia (ALL) was first identified among cases of childhood T-ALL based on its unique immunophenotypic and genetic features of limited (early) T-cell differentiation associated with some myeloid and stem cell features.1 Thus, ETP-ALL blasts express CD7 and dim CD5 (<75% positive cells) in the absence of CD1a and CD8, together with positivity for ≥1 myeloid/stem cell-associated markers (i.e., CD34, CD13 or CD33).1,2 In turn, ETP-ALL frequently carries myeloid-associated gene alterations such as FLT3, NRAS/KRAS, DNMT3A, IDH1 and IDH2 mutations,3,4 with lower frequencies of other T-ALL-associated mutations (e.g., NOTCH1 and CDKN2A/B gene mutations).5,6

The World Health Organization (WHO) 2016 classification of ALL included ETP-ALL for the first time, as a provisional entity,7 but it failed to establish robust diagnostic criteria. Thus, the criteria proposed in the first immunophenotypic characterization of ETP-ALL by Coustan-Smith et al.1 did not allow identification of all ETP-ALL cases detected by gene expression profiling.2 In addition, the "partial CD5 expression" criterion had a negative impact on the reproducibility of ETP-ALL diagnosis because of the lack of standardization of the method used for its assessment. Because of this, Zuurbier et al. proposed refined immunophenotypic criteria that exclude CD5 expression while adding negativity for CD4.2

From the clinical point of view, early studies based on limited numbers of pediatric patients indicated that ETP-ALL was associated with a very poor outcome.1,8,9 More recent data, based on larger series of children treated with more intensive therapy, showed no significant differences in outcome between ETP-ALL and other T-ALL cases.10 In contrast, limited data with conflicting results have been reported for adult ETP-ALL.11,12 In one study, adult ETP-ALL was associated with a worse prognosis following different frontline chemotherapy schedules.11 The