This study investigates students' learning effects and performance trends over time when using the Rancor Microworld simulator. Specifically, it aims to generate insights into the amount of training required to collect human reliability analysis (HRA) data from non-experts (i.e., students) using Rancor Microworld. A longitudinal experiment was conducted with 16 undergraduate students from a department of nuclear engineering in the Republic of Korea. The experiment consisted of four consecutive trials, each featuring four different Rancor Microworld scenarios. Four human performance measures were considered: workload, situation awareness, task completion time, and accuracy. Finally, the students' performance trends were compared with operator performance data collected in a previous experiment. The collected data were analyzed using ANOVA. The results show statistically significant differences across trials in three of the human performance measures: situation awareness, task completion time, and accuracy. Moreover, the students' accuracy approached the level of actual operators. Overall, this research complements Idaho National Laboratory (INL)'s Simplified Human Error Experimental Program (SHEEP) framework for collecting HRA data, as the results provide insights into the training required to collect HRA data from non-experts, as well as into the human performance differences observed when comparing students with professional operators.