This article is a comprehensive literature review of student-facing learning analytics reporting systems, i.e., systems that track learning analytics data and report it directly to students. It builds on four previously conducted literature reviews in similar domains. Of the 945 articles retrieved from databases and journals, 93 were included in the analysis. Articles were coded into five categories: functionality, data sources, design analysis, student perceptions, and measured effects. Based on this review, we identify a need for research on learning analytics reporting systems that targets the design and development process of these systems, not only the final products. This process includes needs analyses, visual design analyses, information selection justifications, and student perception surveys. In addition, experiments are needed to determine the effect of these systems on student behavior, achievement, and skills, adding to the small existing body of evidence. Such experimental studies should include usability tests and methodologies that examine how students actually use these systems, as these factors may affect experimental findings. Finally, observational study methods, such as propensity score matching, should be used so that student access to these systems can be broadened while effects are still measured rigorously.
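The final recommendation names propensity score matching (PSM) as a way to estimate effects when randomized access is impractical. As a minimal sketch only, and not any reviewed study's method, the following Python example illustrates the technique on simulated data; the variables `used_dashboard`, `prior_gpa`, `logins_per_week`, and `final_grade` are hypothetical.

```python
# Minimal propensity score matching sketch on simulated data.
# All variable names and values are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
n = 500
# Covariates that plausibly drive both dashboard use and achievement.
prior_gpa = rng.normal(3.0, 0.5, n)
logins_per_week = rng.poisson(4, n)
X = np.column_stack([prior_gpa, logins_per_week])
# Simulated non-random "treatment": stronger students opt in more often.
used_dashboard = (rng.random(n) < 1 / (1 + np.exp(-(prior_gpa - 3)))).astype(int)
final_grade = 70 + 5 * prior_gpa + 2 * used_dashboard + rng.normal(0, 3, n)

# 1. Estimate propensity scores: P(dashboard use | covariates).
ps = LogisticRegression().fit(X, used_dashboard).predict_proba(X)[:, 1]

# 2. Match each treated student to the control student with the
#    nearest propensity score.
treated = np.where(used_dashboard == 1)[0]
control = np.where(used_dashboard == 0)[0]
nn = NearestNeighbors(n_neighbors=1).fit(ps[control].reshape(-1, 1))
_, idx = nn.kneighbors(ps[treated].reshape(-1, 1))
matched_control = control[idx.ravel()]

# 3. Estimate the effect as the mean outcome difference across pairs.
att = (final_grade[treated] - final_grade[matched_control]).mean()
print(f"Estimated effect of dashboard use: {att:.2f} grade points")
```

The point of the matching step is that students who did and did not use the system are compared only against counterparts with similar observed covariates, approximating the balance that randomization would provide.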
In this paper, 218 U.S. faculty responses regarding Open Educational Resources (OER) were qualitatively analyzed. Ten categories emerged in the coding process. The top three categories indicating barriers to OER adoption were needing more information (faculty wanted more information before they would be willing to adopt OER), lack of discoverability (faculty wanted to be able to easily find repositories of OER), and confusing OER with digital resources (faculty were unaware of the difference between digital resources and OER). The top incentives identified in this analysis for overcoming these barriers were student cost benefits (saving students money), student pedagogical benefits (faculty being able to modify OER to improve course content and instruction), and institutional support for OER adoption (whether in the form of course load reduction, curricular research assistance, or library support for finding and adopting OER). Future research is needed to better understand how to address and overcome these barriers to OER adoption.
This paper aims to link Learning Analytics Dashboards (LADs) to the corpus of research on Open Learner Models (OLMs), as both have similar goals. We conducted a systematic review of work on OLMs and compared it with LADs for learners in terms of (i) data use and modelling, (ii) key publication venues, (iii) authors and articles, (iv) key themes, and (v) system evaluation. We highlight the similarities and differences between the research on LADs and OLMs. Our key contribution is a bridge between these two areas as a foundation for building upon the strengths of each. We report the following key results from the review: in reports of new OLMs, almost 60% are based on a single type of data; 30-40% use behavioural metrics, support input from the user, or have complex models; and just 6% involve multiple applications. Key associated themes include intelligent tutoring systems, learning analytics, and self-regulated learning. Notably, compared with LADs, OLM research is more likely to be interactive (81% of papers compared with 31% for LADs), to report evaluations (76% versus 59%), to use assessment data (100% versus 37%), and to provide a comparison standard for students (52% versus 38%), but less likely to use behavioural metrics (33% against 75% for LADs). In OLM work, there was a heightened focus on learner control and learners' access to their own data. CCS Concepts: • Human-centered computing → Visualization application domains • Human-centered computing → Visualization systems and tools
In this exploratory study, we used an intensive longitudinal approach to measure student engagement in a blended educational technology course, collecting both self-report and observational data. The self-report measure was a short survey of Likert-scale and open-ended questions administered repeatedly during the semester. The observational data were student activity data extracted from the learning management system. We explored how engagement varied over time, both at the course level and between students, to identify patterns and influences of student engagement in a blended course. We found that clarity of instruction and relevance of activities influenced student satisfaction more than the medium of instruction did. Student engagement patterns observed in the log data revealed that exploring learning tools and previewing upcoming assignments and learning activities can be useful indicators of a successful learning experience. Future work will investigate these findings on a larger scale.
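To make the log-based indicators concrete, the sketch below shows one way such learning management system activity data might be aggregated into weekly per-student engagement counts. The column names (`student_id`, `timestamp`, `event_type`) and the toy records are assumptions for illustration, not the study's actual schema.

```python
# Hypothetical sketch: weekly engagement counts from an LMS event log.
import pandas as pd

log = pd.DataFrame({
    "student_id": [1, 1, 2, 2, 2],
    "timestamp": pd.to_datetime([
        "2024-01-08 09:00", "2024-01-09 14:30",
        "2024-01-08 10:15", "2024-01-15 08:45", "2024-01-16 20:00",
    ]),
    "event_type": ["view_tool", "preview_assignment",
                   "view_tool", "view_tool", "preview_assignment"],
})

# Count events per student per ISO week; keeping event types separate
# mirrors the interest in students who explore tools and preview
# upcoming assignments.
log["week"] = log["timestamp"].dt.isocalendar().week
weekly = log.pivot_table(index=["student_id", "week"],
                         columns="event_type", values="timestamp",
                         aggfunc="count", fill_value=0)
print(weekly)
```

A table like `weekly` is the kind of derived observational measure that can then be examined alongside repeated self-report survey responses.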