The 2020 pandemic imposed new demands on teaching practices to support students' distance learning. In this context, automated assessment (AA) is a pivotal resource, offering immediate and automatic feedback on students' programming tasks. Although the literature provides several contributions on the AA of Programming Exercises (PEs), very few works discuss the automatic generation of personalized PEs. This study reports our experience applying a new proposal for AA‐PE in an Introduction to Programming (IP) course with a large group of students. The proposal's key feature is the ability to apply AA‐PE and parameterized unified exams across different programming languages using the open‐source tools MCTest and Moodle (with the Virtual Programming Lab [VPL] plugin). During the first quarter of 2019, teachers of 19 of the 44 IP‐FF (face‐to‐face) classes adopted our approach as a component of their pedagogical intervention. These classes achieved a higher pass rate (67.5%) than those that did not adopt our AA solution (59.1%), while the standard deviations were nearly identical (22.5% and 21.3%, respectively). Additionally, preliminary results revealed a strong linear correlation (r = .93) between the pass rate and the average grade on the AA‐PE. In IP‐BL (blended learning), two classes with 171 students used our method in the exams, achieving a pass rate of 70.4%. These results corroborate previous findings that continuous assessment combined with immediate feedback can contribute to students' learning.