Abstract. It has long been argued that learning analytics has the potential to act as a "middle space" between the learning sciences and data analytics, creating technical possibilities for exploring the vast amount of data generated in online learning environments. One common learning analytics intervention is the learning dashboard, a support tool for teachers and learners alike that allows them to gain insight into the learning process. Although several related works have scrutinised the state of the art in the field of learning dashboards, none have addressed the theoretical foundations that should inform the design of such interventions. In this systematic literature review, we analyse the extent to which theories and models from the learning sciences have been integrated into the development of learning dashboards aimed at learners. Our critical examination reveals the most common educational concepts and the contexts in which they have been applied. We find evidence that current designs foster competition between learners rather than knowledge mastery, offering misguided frames of reference for comparison.
This paper aims to link Learning Analytics Dashboards (LADs) to the corpus of research on Open Learner Models (OLMs), as both have similar goals. We conducted a systematic review of work on OLMs and compared it with LADs for learners in terms of (i) data use and modelling, (ii) key publication venues, (iii) authors and articles, (iv) key themes, and (v) system evaluation. We highlight the similarities and differences between the research on LADs and OLMs. Our key contribution is a bridge between these two areas as a foundation for building upon the strengths of each. We report the following key results from the review: in reports of new OLMs, almost 60% are based on a single type of data; 30-40% use behavioural metrics, support input from the user, or have complex models; and just 6% involve multiple applications. Key associated themes include intelligent tutoring systems, learning analytics, and self-regulated learning. Notably, compared with LADs, OLM research is more likely to be interactive (81% of papers compared with 31% for LADs), to report evaluations (76% versus 59%), to use assessment data (100% versus 37%), and to provide a comparison standard for students (52% versus 38%), but less likely to use behavioural metrics (33% against 75% for LADs). In OLM work, there was a heightened focus on learner control and on learners' access to their own data.
CCS Concepts: • Human-centered computing → Visualization application domains; • Human-centered computing → Visualization systems and tools.
This paper introduces a learning analytics policy and strategy framework developed by a cross-European research project team, SHEILA (Supporting Higher Education to Integrate Learning Analytics), based on interviews with 78 senior managers from 51 European higher education institutions across 16 countries. The framework was developed by adapting the RAPID Outcome Mapping Approach (ROMA), which is designed to develop effective strategies and evidence-based policy in complex environments. This paper presents four case studies to illustrate the development process of the SHEILA framework and how it can be used iteratively to inform strategic planning and policy processes in real-world environments, particularly for large-scale implementation in higher education contexts. To this end, the selected cases were analyzed at two stages, a year apart, to investigate the progression of the adoption approaches that were followed to solve existing challenges, and to identify new challenges that could be addressed by following the SHEILA framework.
Notes for Practice
• This paper presents a framework that can be used to assist with strategic planning and policy processes for learning analytics.
• This research builds on the RAPID Outcome Mapping Approach (ROMA) and adapts it by including elements of actions, challenges, and policy prompts.
• The proposed framework was developed based on the experiences of learning analytics adoption at 51 European higher education institutions.
• The proposed framework will enhance systematic adoption of learning analytics on a wide scale.