Educational research has established that learning can be defined as an enduring change in behaviour resulting from practice or other forms of experience. In introductory programming courses, proficiency is typically approximated through relatively small but frequent assignments and tests. Scaling these assessments to track significant behavioural change is challenging because of the subtle and complex metrics that must be collected from large student populations. Based on a four-semester study, we present an analysis of learning tool interaction data collected from 514 students and 38,796 solutions to practice programming exercises. We first evaluate the effectiveness of measuring workflow patterns to detect students at risk of failure within the first three weeks of the semester. Our early predictor accurately detects 81% of the students who struggle throughout the course. However, it also captures transient struggling: 43% of the students who ultimately did well in the course were classified as at risk. To better differentiate sustained from transient struggling, we further propose a trajectory metric that measures changes in programming behaviour. The trajectory metric detects 70% of the students who exhibit sustained struggling, and misclassifies only 11% of the students who go on to succeed in the course. Overall, our results show how detecting changes in programming behaviour can help us differentiate between learning and struggling in CS1.
Over the past few years, games courses have been gaining in popularity, as growing evidence shows positive results for enrollment and student engagement. Nevertheless, new graduates still lack critical teamwork and problem-solving skills required by industry employers. Building upon other game programs with successful results, we present a game design course developed to attract students of all disciplines. Our course is different because we focus on three main issues directly associated with new graduates entering industry: cooperative learning, peer review, and orientation with a pre-existing large code base. A quantitative analysis reveals both positive and negative impacts on students' interests, in particular within a cohort of non-Computer Science majors. A qualitative analysis reveals the ways in which students were influenced by a course design whose assessment was aligned with key industry issues.
Jobs in the computing field demand communication and teamwork skills in addition to programming skills. Focus at the undergraduate level needs to shift towards developing these collaborative skills to enable a smoother transition into employment in industry. The University of Victoria is in its second year of offering a course on game design. In the first offering, new activities were introduced to address issues identified by recent studies on university graduates entering industry. We focused on integrating cooperative learning, group orientation, and peer review activities into the game design process. The course attracted students across multiple disciplines, and an analysis indicated increased student interest in pursuing a computer science degree. Unfortunately, the same pre- and post-surveys suggested that our collaborative activities may have decreased student interest in the course work and in pursuing studies in game design. In this paper we report on how we used a studio-based pedagogical approach to restructure the peer review activities in our course. In our previous offering, students received peer feedback only on their final game presentation. In our latest offering, we integrated peer review activities into every step of the game development process, allowing students to refine their ideas while progressing through a game project. A quantitative analysis shows that our refined peer review activities were successful in increasing student presentation confidence, sense of community, and excitement towards their course projects.