Purpose
ACR and AAPM task group guidelines addressing the commissioning of dedicated MR simulators were recently published. The goal of the current paper is to present the authors' 2-year experience with commissioning and introducing a QA program based on these guidelines, together with an associated automated workflow.

Methods
All mandatory commissioning tests suggested by AAPM Report 284 were performed, and results are reported for two MRI scanners (MAGNETOM Sola and Aera). Tests were carried out using visual inspection, the vendor's clinical or service platform, third-party software, or in-house Python-based code. Automated QA and data analysis were performed via vendor, in-house, or third-party software. QATrack+ was used for QA data logging and storage. 3D geometric distortion, B0 inhomogeneity, EPI, and parallel imaging performance were evaluated.

Results
In contrast to the AAPM Report 284 recommendations, homogeneity and RF tests were performed monthly. The QA program allowed us to detect major failures over time (shimming, gradient calibration, and RF interference). Automated QA, data analysis, and logging enabled fast daily ACR analysis and allowed monthly QA to be performed in 3 h. On the Sola, the average distortion was 1 mm for imaging radii of 250 mm or less. For radii of up to 200 mm, the maximum and average (standard deviation) distortions were 1.2 mm and 0.4 mm (0.3 mm), respectively. Aera values were roughly double those of the Sola for radii up to 200 mm. EPI geometric distortion, ghosting ratio, and long-term stability were found to be under the maximum recommended values. The parallel imaging SNR ratio was stable and close to the theoretical value (ideal g-factor). No major failures were detected during commissioning.

Conclusion
An automated workflow and an enhanced QA program allowed us to automatically track machine and environmental changes over time and to detect periodic failures and errors that might otherwise have gone unnoticed. The Sola is more geometrically accurate, with a more homogeneous B0 field, than the Aera.
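
The EPI ghosting ratio reported above follows the standard ACR percent-signal-ghosting definition, computed from background and phantom ROI means. The snippet below is a minimal sketch of that calculation, not the authors' in-house code; the ROI masks (phantom, top, bottom, left, right) are assumed to be provided by an upstream segmentation step.

import numpy as np

def ghosting_ratio(image: np.ndarray, phantom: np.ndarray,
                   top: np.ndarray, bottom: np.ndarray,
                   left: np.ndarray, right: np.ndarray) -> float:
    """ACR-style percent-signal ghosting:
    |(top + bottom) - (left + right)| / (2 * phantom mean),
    where each term is the mean signal inside the corresponding boolean ROI mask."""
    mean = lambda mask: float(image[mask].mean())
    return abs((mean(top) + mean(bottom)) - (mean(left) + mean(right))) / (2.0 * mean(phantom))

A value below the recommended maximum (e.g., the ACR action limit) would be logged per session in QATrack+ and trended over time.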
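Similarly, the parallel imaging check compares the measured SNR ratio between accelerated and unaccelerated acquisitions against the theoretical ratio 1/(g·sqrt(R)), which reduces to 1/sqrt(R) for an ideal g-factor of 1. The sketch below illustrates that comparison under those assumptions; the numeric inputs are illustrative placeholders, not measured results.

import numpy as np

def ideal_snr_ratio(acceleration_factor: float) -> float:
    """Theoretical SNR ratio for an ideal g-factor of 1: SNR_R / SNR_1 = 1 / sqrt(R)."""
    return 1.0 / np.sqrt(acceleration_factor)

def estimated_g_factor(snr_accelerated: float, snr_unaccelerated: float,
                       acceleration_factor: float) -> float:
    """Effective g-factor implied by the measured SNRs: g = SNR_1 / (SNR_R * sqrt(R))."""
    return snr_unaccelerated / (snr_accelerated * np.sqrt(acceleration_factor))

# Illustrative values only: for R = 2, a measured ratio near 1/sqrt(2) ≈ 0.71
# corresponds to an effective g-factor close to the ideal value of 1.
print(ideal_snr_ratio(2.0))
print(estimated_g_factor(snr_accelerated=85.0, snr_unaccelerated=120.0, acceleration_factor=2.0))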