This article addresses a relatively unexplored area in the emerging field of learning analytics: the design of learning analytics interventions. A learning analytics intervention is defined as the surrounding frame of activity through which analytic tools, data, and reports are taken up and used. It is a soft technology that involves orchestrating the human process of engaging with the analytics as part of the larger teaching and learning activity. This paper first makes the case for the overall importance of intervention design, situating it within the larger landscape of the learning analytics field, and then considers the specific issues of intervention design for student use of learning analytics. Four principles of pedagogical learning analytics intervention design that teachers and course developers can use to support students' productive use of learning analytics are introduced: Integration, Agency, Reference Frame, and Dialogue. In addition, three core processes in which to engage students are described: Grounding, Goal-Setting, and Reflection. These principles and processes are united in a preliminary model of pedagogical learning analytics intervention design for students, presented as a starting point for further inquiry.
The process of using analytic data to inform instructional decision-making is acknowledged to be complex; however, the details of how it occurs in authentic teaching contexts have not been fully unpacked. This study investigated five university instructors' use of a learning analytics dashboard to inform their teaching. The existing literature was synthesized to create a template for inquiry that guided interviews, and inductive qualitative analysis was used to identify salient emergent themes in how instructors 1) asked questions, 2) interpreted data, 3) took action, and 4) checked impact. Findings showed that instructors did not always come to analytics use with specific questions, but rather with general areas of curiosity; questions additionally emerged and were refined through interaction with the analytics. Data interpretation involved two distinct activities, often accompanied by affective reactions to the data: reading the data to identify noteworthy patterns and explaining their importance in the course using contextual knowledge. Pedagogical responses to the analytics included whole-class scaffolding, targeted scaffolding, and revising course design, as well as two new non-action responses: adopting a wait-and-see posture and engaging in deep reflection on pedagogy. Findings were synthesized into a model of instructor analytics use that offers useful categories of activities for future study and support.
This paper describes an application of learning analytics that builds on an existing research program investigating how students contribute and attend to the messages of others in asynchronous online discussions. We first overview the E-Listening research program and then explain how this work was translated into analytics that students and instructors could use to reflect on their discussion participation. Two kinds of analytics were designed: some embedded in the learning environment to provide students with real-time information on their activity in progress, and some extracted from the learning environment and presented to students in a separate digital space for reflection. In addition, we describe the design of an intervention through which use of the analytics can be introduced as an integral course activity. Findings from an initial implementation of the application indicated that the learning analytics intervention supported changes in students' discussion participation. Five issues for future work on learning analytics in online discussions are presented: one, unintentional versus purposeful change; two, differing changes prompted by the same analytic; three, the importance of theoretical buy-in and calculation transparency for perceived analytic value; four, affective components of students' reactions; and five, support for students in the process of enacting analytics-driven changes.
It is an exhilarating and important time for conducting research on learning, with unprecedented quantities of data available. There is a danger, however, in thinking that with enough data, the numbers speak for themselves. In fact, with larger amounts of data, theory plays an ever-more critical role in analysis. In this introduction to the special section on learning analytics and learning theory, we describe some critical problems in the analysis of large-scale data that occur when theory is not involved. These problems revolve around which variables a researcher should attend to and how to interpret a multitude of micro-results and make them actionable. We conclude our comments with a discussion of how the collection of empirical papers included in the special section, and the commentaries that were invited on them, speak to these challenges, and in doing so represent important steps toward theory-informed and theory-contributing learning analytics work. Our ultimate goal is to provoke a critical dialogue in the field about the ways in which learning analytics research draws on and contributes to theory.