Minimal residual disease (MRD) quantification is an important predictor of outcome after treatment for acute lymphoblastic leukemia (ALL). Bone marrow ALL burden ≥ 10⁻⁴ after induction predicts subsequent relapse. Likewise, MRD ≥ 10⁻⁴ in bone marrow prior to the initiation of conditioning for allogeneic hematopoietic cell transplantation (allo-HCT) predicts transplant failure. Current methods for MRD quantification in ALL are not sufficiently sensitive for use with peripheral blood specimens and have not been broadly implemented in the management of adults with ALL. Consensus-primed immunoglobulin (Ig) and T-cell receptor (TCR) amplification and high-throughput sequencing (HTS) permit the use of a standardized algorithm for all patients and can detect leukemia at levels of 10⁻⁶ or lower. We applied the Sequenta LymphoSIGHT™ HTS platform to the quantification of MRD in 237 samples from 29 adult B-ALL patients before and after allo-HCT. Using primers for the IGH-VDJ, IGH-DJ, IGK, TCRB, TCRD, and TCRG loci, MRD could be quantified in 93% of patients. Leukemia-associated clonotypes at these loci were identified in 52%, 28%, 10%, 35%, 28%, and 41% of patients, respectively. MRD ≥ 10⁻⁴ before HCT conditioning predicted post-HCT relapse (hazard ratio [HR] 7.7, 95% CI 2.0–30, p=0.003). In post-HCT blood samples, MRD ≥ 10⁻⁶ had a 100% positive predictive value for relapse, with a median lead time of 89 days (HR 14, 95% CI 4.7–44, p<0.0001). HTS-based MRD quantification in adults with ALL offers a standardized approach with sufficient sensitivity to quantify MRD in peripheral blood. This approach may identify a window for clinical intervention prior to overt relapse.
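
To make the thresholds concrete: an MRD level of 10⁻⁴ corresponds to one leukemic cell equivalent per 10,000 nucleated cells assessed. The Python sketch below is a hypothetical illustration, not the LymphoSIGHT analysis pipeline; the function names and the simple molecules-per-cell-equivalent normalization are assumptions made for clarity. It shows how such a frequency could be computed and compared against the pre-HCT (10⁻⁴) and post-HCT blood (10⁻⁶) thresholds reported above.

    def mrd_level(leukemic_molecules: int, total_cell_equivalents: int) -> float:
        """MRD as leukemic clonotype molecules per nucleated cell equivalent.

        Assumes sequencing reads have already been collapsed to input
        molecules and normalized to cell equivalents; the details of that
        normalization are platform-specific and not modeled here.
        """
        if total_cell_equivalents <= 0:
            raise ValueError("need a positive number of cell equivalents")
        return leukemic_molecules / total_cell_equivalents

    def classify(mrd: float, threshold: float) -> str:
        """Compare an MRD level against a clinical decision threshold."""
        return "MRD-positive" if mrd >= threshold else "MRD-negative"

    # Example: 3 leukemic clonotype molecules among 1,000,000 cell
    # equivalents gives MRD = 3e-6, below the 1e-4 pre-HCT threshold
    # but at or above the 1e-6 post-HCT blood threshold used in this study.
    level = mrd_level(3, 1_000_000)
    print(level)                      # 3e-06
    print(classify(level, 1e-4))      # MRD-negative (pre-HCT criterion)
    print(classify(level, 1e-6))      # MRD-positive (post-HCT criterion)

As the example shows, a sample can be MRD-negative by the bone marrow pre-conditioning criterion yet MRD-positive by the more sensitive post-HCT blood criterion, which is why the added sensitivity of HTS matters for surveillance in peripheral blood.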