Online learning is the fastest growing segment in U.S. higher education and is increasingly adopted in public and private not-for-profit institutions. While the impact of online learning on educational outcomes is becoming clearer, the literature on its connection with student engagement is sparse. Student engagement measures identify key aspects of the learning process that can improve learning and outcomes like retention and achievement. The few studies investigating the link between online learning and student engagement found benefits for online learners compared to face-to-face learners in terms of perceived academic challenge, learning gains, satisfaction, and study habits. On the other hand, face-to-face learners reported higher levels of environmental support, collaborative learning, and faculty interaction. However, these studies did not effectively account for differences in background characteristics like age, time spent working or caring for dependents, and enrollment status. Further, they did not consider the increasingly large population of students who enroll in both online and face-to-face courses. In our study, we used propensity score matching on the 2015 National Survey of Student Engagement data to account for the disparities in these groups' demographic variables. After matching, we found that some of the differences reported in the previous literature diminish or disappear entirely. This suggests that differences in supportive environments and learning strategies have more to do with online student characteristics than with learning mode. However, online learning still falls well below other modes in terms of collaborative learning and interaction with faculty.
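For readers less familiar with the matching step described above, the following is a minimal sketch of 1:1 nearest-neighbor propensity score matching in Python. The data frame `df`, the treatment column name, and the covariate names are illustrative placeholders, not the variables or matching specification used in the NSSE analysis.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

def propensity_score_match(df, treatment_col, covariate_cols):
    """1:1 nearest-neighbor matching on estimated propensity scores.

    `df` is assumed to be a pandas DataFrame with a binary treatment
    indicator (e.g., 1 = online learner) and numeric covariates.
    """
    X = df[covariate_cols].to_numpy()
    t = df[treatment_col].to_numpy()

    # Estimate each student's probability of being in the online group.
    scores = LogisticRegression(max_iter=1000).fit(X, t).predict_proba(X)[:, 1]

    treated_idx = df.index[t == 1]
    control_idx = df.index[t == 0]

    # For each online student, find the face-to-face student with the
    # closest propensity score (matching with replacement).
    nn = NearestNeighbors(n_neighbors=1).fit(scores[t == 0].reshape(-1, 1))
    _, pos = nn.kneighbors(scores[t == 1].reshape(-1, 1))
    matched_control_idx = control_idx[pos.ravel()]

    # Matched sample: online students plus their matched comparisons.
    return df.loc[treated_idx.append(matched_control_idx)]

# Example call with hypothetical column names:
# matched = propensity_score_match(df, "online", ["age", "hours_worked", "enrollment"])
```

Engagement indicators would then be compared within the matched sample, so that observed differences are less confounded by background characteristics.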
The rise in popularity and use of cognitive diagnostic models (CDMs) in educational research is partly motivated by the models' ability to provide diagnostic information regarding students' strengths and weaknesses in a variety of content areas. An important step to ensure appropriate interpretations from CDMs is to investigate differential item functioning (DIF). To this end, the current simulation study examined the performance of three methods for detecting DIF in CDMs, with particular emphasis on the impact of Q-matrix misspecification on the methods' performance. Results illustrated that logistic regression and Mantel–Haenszel had better control of Type I error than the Wald test; however, high power rates were found only for the logistic regression and Wald methods. In addition to the tradeoff between Type I error control and acceptable power, our results suggested that Q-matrix complexity and item structure yield different results for different methods, presenting a more complex picture of the methods' performance. Finally, implications and future directions are discussed.
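As a rough illustration of one of the three detectors, the sketch below shows a generic logistic-regression DIF screen for a single dichotomous item: a likelihood-ratio test comparing a model with the matching criterion alone against one that adds group membership and a criterion-by-group interaction. The function name, the choice of matching criterion, and the 2-df test are assumptions for illustration, not the exact specification evaluated in the study.

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import chi2

def lr_dif_test(item, total, group):
    """Likelihood-ratio DIF test for one item.

    item  : 0/1 responses to the studied item
    total : matching criterion (e.g., total score or attribute-based score)
    group : 0 = reference group, 1 = focal group
    """
    item = np.asarray(item, dtype=float)
    total = np.asarray(total, dtype=float)
    group = np.asarray(group, dtype=float)

    # Baseline model: response predicted by the matching criterion only.
    base = sm.Logit(item, sm.add_constant(total)).fit(disp=0)

    # Augmented model: adds group (uniform DIF) and the criterion-by-group
    # interaction (non-uniform DIF).
    X_full = sm.add_constant(np.column_stack([total, group, total * group]))
    full = sm.Logit(item, X_full).fit(disp=0)

    # 2-df chi-square test on the two added DIF terms.
    lr_stat = 2 * (full.llf - base.llf)
    return lr_stat, chi2.sf(lr_stat, df=2)
```

In a CDM setting, the matching criterion is typically replaced by estimated attribute mastery rather than a raw total score; the code above only conveys the general logic of the test.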
Cognitive diagnostic models (CDMs) are of growing interest in educational research because of their ability to provide diagnostic information regarding examinees' strengths and weaknesses across a variety of content areas. An important step to ensure appropriate uses and interpretations of CDMs is to understand the impact of differential item functioning (DIF). While methods of detecting DIF in CDMs have been identified, there is limited understanding of the extent to which DIF affects classification accuracy. This simulation study provides a reference for practitioners to understand how different magnitudes and types of DIF interact with CDM item types, group distributions, and sample sizes to influence attribute- and profile-level classification accuracy. The results suggest that attribute-level classification accuracy is robust to DIF of large magnitude under most conditions, while profile-level classification accuracy is negatively influenced by the inclusion of DIF. Unequal group distributions and DIF located on simple-structure items had the greatest effect in decreasing classification accuracy. The article closes by considering implications of the results and future directions.
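The two accuracy metrics referenced above can be computed directly from true and estimated attribute profiles. The sketch below assumes 0/1 arrays of shape (examinees × attributes); the array and function names are illustrative, not those used in the study.

```python
import numpy as np

def classification_accuracy(true_profiles, est_profiles):
    """Attribute- and profile-level classification accuracy.

    Both inputs are (n_examinees, n_attributes) arrays of 0/1 mastery states.
    """
    true_profiles = np.asarray(true_profiles)
    est_profiles = np.asarray(est_profiles)

    # Attribute-level: proportion of examinees classified correctly,
    # computed separately for each attribute.
    attribute_acc = (true_profiles == est_profiles).mean(axis=0)

    # Profile-level: proportion of examinees whose entire attribute
    # pattern is recovered exactly.
    profile_acc = (true_profiles == est_profiles).all(axis=1).mean()

    return attribute_acc, profile_acc
```

Because the profile-level criterion requires every attribute to be recovered simultaneously, it is naturally more sensitive to DIF-induced misclassification than the attribute-level criterion, consistent with the pattern of results described above.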