Automatic assessment and evaluation of team performance during collaborative tasks is key to research on learning analytics and computer-supported cooperative work. There is growing interest in using gaze-oriented cues to evaluate the collaboration and cooperativeness of teams. However, collecting gaze data with eye-trackers is not always feasible due to time and cost constraints. In this paper, we introduce an automated team assessment tool based on gaze points and joint visual attention (JVA) information derived from computer vision solutions. We evaluated team collaboration in a user study of an undergraduate anatomy learning activity (N = 60, 30 teams). The results indicate that higher JVA was positively associated with student learning outcomes (r(30) = 0.50, p < 0.005). Moreover, teams in the two experimental groups, which used interactive 3D anatomy models, had higher JVA (F(1,28) = 6.65, p < 0.05) and better knowledge retention (F(1,28) = 7.56, p < 0.05) than teams in the control group. No significant difference in JVA was observed across teams with different gender compositions. The findings from this work have implications for the learning sciences and collaborative computing by providing a novel joint attention-based measure for objectively evaluating team collaboration dynamics.
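As a minimal illustrative sketch (not the exact pipeline used in this study), JVA can be operationalised from two teammates' time-aligned gaze streams as the proportion of samples at which their gaze points fall within a fixed on-screen distance of each other; the threshold value, array names, and the assumption of already-synchronized, equally sampled gaze data below are hypothetical.

```python
import numpy as np

def joint_visual_attention(gaze_a, gaze_b, threshold_px=100.0):
    """Estimate JVA as the share of synchronized samples at which two
    teammates' gaze points lie within `threshold_px` of each other.

    gaze_a, gaze_b: arrays of shape (n_samples, 2) holding (x, y) gaze
    points, assumed time-aligned and equally sampled (an illustrative
    simplification, not necessarily this study's processing pipeline).
    """
    gaze_a = np.asarray(gaze_a, dtype=float)
    gaze_b = np.asarray(gaze_b, dtype=float)
    # Euclidean distance between the two gaze points at each time step
    distances = np.linalg.norm(gaze_a - gaze_b, axis=1)
    # Fraction of samples spent in joint attention
    return float(np.mean(distances < threshold_px))

# Example: two teammates looking at roughly the same screen regions
a = np.array([[100, 120], [105, 118], [400, 300], [410, 305]])
b = np.array([[110, 125], [500, 400], [395, 310], [405, 300]])
print(joint_visual_attention(a, b))  # -> 0.75
```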
Implications for practice or policy:
Student learning outcomes can be improved by giving students constructive feedback on their team's performance, derived from our gaze-based collaboration assessment method.
Underrepresented and underserved minorities in science, technology, engineering and mathematics disciplines can be engaged in more collaborative problem-solving and team-based learning activities, since our method offers broader reach by automating the collaboration assessment process.
Course leaders can assess the quality of students' attention and engagement and can monitor or assist larger numbers of students simultaneously.