Learning analytics involves the measurement, collection, analysis, and reporting of data about learners and their contexts, in order to understand and optimize learning and the environments in which it occurs. Since emerging as a distinct field in 2011, learning analytics has grown rapidly, and early adopters around the world are developing and deploying these new tools. This paper reports on a study that investigated how the field is likely to develop by 2025, in order to recommend actions to those concerned with the implementation of learning analytics. The study used a Policy Delphi approach, presenting a range of future scenarios to international experts in the field and asking for responses on the desirability and feasibility of these scenarios, as well as the actions they would require. Responses were received from 103 people in 21 countries. Responses were coded thematically, inter-rater reliability was checked using Cohen's kappa coefficient, and data were recoded if kappa fell below 0.6 (see the computational sketch after the notes below). The seven major themes identified within the data were power, pedagogy, validity, regulation, complexity, ethics, and affect. The paper considers each of these themes in detail, together with its implications for the implementation of learning analytics.

Notes for Practice
• This paper analyzes responses from around the world and identifies seven factors that must be taken into account when implementing learning analytics: power, pedagogy, validity, regulation, complexity, ethics, and affect.
• Responses revealed widespread unease about how analytics may develop. Bringing people together to engage with and understand the issues will be one way of addressing this problem.
• It is also important for different communities to discuss and understand the value of their personal data and to know how they can be used, developed, and protected.
• Evidence that learning analytics delivers benefits is not yet convincing to experts in the field.
• Major investment of thoughtful effort is required in terms of research agendas and funding, policy and regulation, and developing and informing practice among all those who engage with learning analytics.
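The abstract does not describe how the reliability check was computed, but Cohen's kappa has a standard definition: kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement between two raters and p_e is the agreement expected by chance given each rater's marginal code frequencies. The following is a minimal sketch, assuming two raters assign one thematic code per response; the function name and example data are illustrative, not drawn from the study.

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two raters assigning one categorical code per item."""
    n = len(rater1)
    # Observed agreement: fraction of items both raters coded identically.
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Chance agreement: sum over codes of the product of the raters'
    # marginal proportions.
    c1, c2 = Counter(rater1), Counter(rater2)
    p_e = sum((c1[k] / n) * (c2[k] / n) for k in set(c1) | set(c2))
    return (p_o - p_e) / (1 - p_e)

# Hypothetical thematic codes assigned by two raters to ten responses.
r1 = ["power", "ethics", "ethics", "validity", "power",
      "affect", "ethics", "pedagogy", "power", "ethics"]
r2 = ["power", "ethics", "affect", "validity", "power",
      "affect", "ethics", "pedagogy", "ethics", "ethics"]

kappa = cohens_kappa(r1, r2)
print(f"kappa = {kappa:.2f}")  # about 0.73 for this toy data
if kappa < 0.6:
    print("Agreement below 0.6: recode this theme.")
```

On this toy data, kappa is roughly 0.73, so under the study's stated rule the coding would stand; a value below 0.6 would trigger recoding.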
Abstract: An IMS Learning Design Expert Workshop was held at the University of Vienna on November 20–21, 2008. This report describes the purpose of the workshop, its methodology, and its results. Participating experts first brainstormed visions of, and problems with, IMS Learning Design (IMS LD), and then developed potential solutions to some of the identified problems. Three groups formed to work on two of the identified problems in more depth: the usability and utility problem, and the life cycle of a unit of learning problem. The proposed solutions to the usability and utility problem were to investigate how teachers' and learners' representations of a learning design can be brought together, and to set up a research program to identify how teachers proceed cognitively when designing courses and to map this knowledge to IMS LD. Regarding the life cycle of a unit of learning, the group suggested a system that continually exchanges information between runtime and editing systems so that units of learning can be updated accordingly.
The ethical context of Learning Analytics is framed by two related processes. Firstly, the amount of personal data available to organisations has been transformed by computerisation and the subsequent development of the Internet. Secondly, the methods and ethical assumptions of Operations Research have been extended into new areas. Learning Analytics can be conceptualised as the extension of Operations Research methods to educational institutions, in a process facilitated by technological and social changes in the early twenty-first century. It is argued that the ethical discourse has treated Learning Analytics as a discrete field and focused on its internal processes, at the expense of its connections with the wider social context. As a result, contradictions arise in the practice of research ethics, and a number of urgent issues are not given due consideration. These include the partial erosion of the consensus around the Nuremberg Code; the use of ethical waivers for quality improvement; the coercive extraction of data; the use of analytics as an enabling technology for management; and the educational implications of the relationship between surveillance and trust.

Ethics and Learning Analytics
As a first step in considering the ethical context in which Learning Analytics (LA) is carried out, it is necessary to discuss the declared purpose and modus operandi of the field. Writing at an early stage in the development of LA, Long and Siemens, two of the principal actors in the field, explained the rationale for its development, arguing that research indicates that "data-driven decision-making improves organizational output and productivity", and that education is falling behind other fields in this respect:

Higher education, a field that gathers an astonishing array of data about its "customers," has traditionally been inefficient in its data use, …. Organizational processes, such as planning and resource allocation, often fail to utilize large amounts of data on effective learning
Background
During the past decade, the increasingly heterogeneous field of learning analytics has been critiqued for an over-emphasis on data-driven approaches at the expense of attention to learning designs.

Method and objective
In response to this critique, we investigated the role of learning design in learning analytics through a systematic literature review. A total of 161 learning analytics (LA) articles were examined to identify indicators that were based on learning design events and their associated metrics. Through this research, we address two objectives: first, to achieve a better alignment between learning design and learning analytics by proposing a reference framework that presents possible connections between the two (see the sketch below); and second, to show how LA indicators and metrics have been researched and applied in the past.

Results and conclusion
In our review, we found that a number of learning analytics papers did indeed consider learning design activities when harvesting user data. We also found a consistent increase in the number and quality of indicators, and traced their evolution over the years.
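The abstract does not spell out the proposed reference framework, so the following is a purely hypothetical sketch of how one connection between a learning design event and an LA indicator might be represented as a data structure; all names and entries are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Indicator:
    """Hypothetical link between a learning design event and an LA indicator."""
    design_event: str  # the learning design activity that generates the data
    indicator: str     # what the analytics layer reports about learners
    metric: str        # how the indicator is quantified

# Invented example rows; the framework in the reviewed papers may differ.
framework = [
    Indicator("discussion forum task", "participation", "posts per learner per week"),
    Indicator("quiz attempt", "mastery", "first-attempt success rate"),
    Indicator("video lecture", "engagement", "proportion of video watched"),
]

for row in framework:
    print(f"{row.design_event} -> {row.indicator} ({row.metric})")
```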