2014
DOI: 10.18608/jla.2014.12.3
Epistemology, Assessment, Pedagogy: Where Learning Meets Analytics in the Middle Space

Abstract: Learning Analytics is an emerging research field and design discipline that occupies the "middle space" between the learning sciences/educational research and the use of computational techniques to capture and analyze data (Suthers & Verbert, 2013). We propose that the literature examining the triadic relationships between epistemology (the nature of knowledge), pedagogy (the nature of learning and teaching), and assessment provides critical considerations for bounding this middle space. We provide exa…

Cited by 135 publications (94 citation statements)
References 59 publications
“…Furthermore, data literacy and use are highly relevant for both student teachers and practicing teachers in order to keep up with an increasing focus on formative e-assessment that uses educational data mining and learning analytics methods (e.g. Knight, Buckingham Shum & Littleton, 2014; Kickmeier-Rust et al., 2014; Vatrapu, Reimann, Hussain & Pantazos, 2016).…”
Section: Discussion
confidence: 99%
“…In relation to education specifically, there is burgeoning research on learning analytics and their potential to enhance provision for students, particularly in relation to higher education (Buckingham Shum, 2014; Finn, 2015; Gasevic, Dawson, and Siemens, 2015). Within this realm, there are significant efforts to examine issues, such as how code and algorithms reflect and relate to specific understandings of pedagogy and assessment (Knight, Buckingham Shum, and Littleton, 2014) and issues of student privacy (Prinsloo and Slade, 2015). There are also attempts to develop greater transparency through the use of open source software and through participatory design approaches to the development of analytics.…”
Section: Standardisation, Software, Algorithms and Inscrutable Practices
confidence: 99%
“…Understanding this full cycle is crucial, and it is to this issue that we turn in the section "The Learning Analytics Lifecycle." While DCLA of various sorts may be productive, we should be aware of the interpretive flexibility of the tools we develop, and the pedagogic, assessment, and epistemological context of which they form a part (Knight et al., 2014). However, if we are able to operationalize some stable and general categories and patterns in discourse use, then we can model such patterns in classifiers, recognizing that such modelling involves a balance around acceptable levels of information loss (Rosé & Tovares, 2015).…”
Section: Enabling Sharing of Ideas That Can Be Improved Together
confidence: 99%
“…However, their use may also make commitments to particular pedagogic, assessment, and epistemological assumptions (Knight et al., 2014). While tools have particular affordances (for example, twitter replies, forum threading, knowledge-mapping connections), these tools can be used in many learning contexts, and their data analyzed for many purposes.…”
Section: What Systems Need To Be in Place for the Successful Deployme…
confidence: 99%