Learning technologies enable interventions in the learning process that aim to improve learning. Learning analytics provides such interventions based on the analysis of learner data and is believed to benefit both learning and the learning environment. Literature reporting on the effects of learning analytics interventions allows us to assess in what ways learning analytics improves learning. However, no standard set of operational definitions is available for the learning affected by learning analytics interventions. We performed a systematic literature review of 1932 search hits, which yielded 62 key studies. We analyzed how affected learning was operationalized in these key studies and classified the operational definitions into three categories: 1) learning environment; 2) learning process; and 3) learning outcome. A deeper analysis yielded a refined classification scheme with 11 subcategories. Most of the analyzed studies relate to either learning outcome or learning process; only nine of the key studies relate to more than one category. Given the complex nature of applying learning analytics interventions in practice, measuring effects across a wider spectrum of aspects can give more insight into how such interventions work on the different actors, processes, and outcomes involved. Based on the results of our review, we recommend making deliberate decisions about the (multiple) aspects of learning one tries to improve by applying learning analytics. Our refined classification, with examples of operational definitions, may help both academics and practitioners do so, as it allows for a more structured, grounded, and comparable positioning of learning analytics benefits.
Despite the promises of learning analytics and the existence of several learning analytics implementation frameworks, large-scale adoption of learning analytics within higher education institutions remains low. Extant frameworks either focus on a specific element of learning analytics implementation, such as policy or privacy, or lack operationalization of the organizational capabilities necessary for successful deployment. Therefore, this literature review addresses the research question “What capabilities for the successful adoption of learning analytics can be identified in existing literature on big data analytics, business analytics, and learning analytics?” Our research is grounded in resource-based view theory, and we extend the scope beyond the field of learning analytics to include capability frameworks from the more mature research fields of big data analytics and business analytics. This paper’s contribution is twofold: 1) it provides a literature review of known capabilities for big data analytics, business analytics, and learning analytics, and 2) it introduces a capability model to support the implementation and uptake of learning analytics. During our study, we identified and analyzed 15 key studies. By synthesizing the results, we found 34 organizational capabilities important to the adoption of analytical activities within an institution, along with 461 ways to operationalize these capabilities. Five categories of capabilities can be distinguished: Data, Management, People, Technology, and Privacy & Ethics. Capabilities presently absent from existing learning analytics frameworks concern sourcing and integration, market, knowledge, training, automation, and connectivity. Based on the results of the review, we present the Learning Analytics Capability Model: a model that provides senior management and policymakers with concrete operationalizations to build the capabilities necessary for successful learning analytics adoption.
Digital innovation in education – as in any other sector – is not only about developing and implementing novel ideas, but also about having these ideas effectively used as well as widely accepted and adopted, so that many students can benefit from innovations that improve education. Effectiveness, transferability, and scalability cannot be added afterwards; they must be integrated from the start into the design, development, and implementation processes, as proposed in the movement towards evidence-informed practice (EIP). Yet the impact an educational innovation has on the values of various stakeholders is often overlooked. Value Sensitive Design (VSD) is an approach for integrating values into technological design. In this paper we discuss how EIP and VSD may be combined into an integrated approach to digital innovation in education, which we call value-informed innovation. This approach considers not only educational effectiveness but also the innovation’s impact on human values and its scalability and transferability to other contexts. We illustrate the integrated approach with an example case of an educational innovation involving digital peer feedback.