Massive Open Online Courses (MOOCs) have gained considerable attention in recent years as a new technology-enhanced learning (TEL) approach in higher education. MOOCs provide educational opportunities to massive numbers of learners around the globe by offering free online courses. Discussions around MOOCs have focused on their potential as well as on social, institutional, technological, relevance, and marketing issues, and less on the quality of MOOC design. Several studies have reported an average drop-out rate of 95% of course participants, along with other pedagogical problems concerning assessment and feedback. Thus, the quality of MOOC design warrants further investigation. Although several studies have identified a large set of criteria for the successful design of TEL systems in general, not all of them can be applied in the MOOC context, owing to some unique features of MOOCs. This study is a first step towards identifying specific criteria that need to be considered when designing and implementing MOOCs. The results of this empirical study are based on a large survey targeting learners as well as professors, both with MOOC experience. As a result, we identified and rated 74 indicators, classified into our two main dimensions of pedagogical and technological criteria and distributed over six categories. Among these, the learning analytics and assessment categories were found to be the key features of effective MOOCs.
In the past few years, there has been increasing interest in Massive Open Online Courses (MOOCs) as a new form of Technology-Enhanced Learning (TEL) in higher education and beyond. Recognizing the limitations of standalone MOOCs, blended MOOCs (bMOOCs), which aim to bring in-class (i.e., face-to-face) interactions and online learning components together, have emerged as an alternative MOOC model of teaching and learning in a higher education context. In this paper, we present the design, implementation, and evaluation details of a bMOOC course on "Teaching Methodologies" at Fayoum University, Egypt, offered in cooperation with RWTH Aachen University, Germany, using the bMOOC platform L²P-bMOOC. To gauge the usability and effectiveness of the course, we employed an evaluation approach based on Conole's 12-dimension rubric, the ISONORM 9241/110-S questionnaire as a general usability evaluation, and a custom effectiveness questionnaire reflecting the different MOOC stakeholder perspectives.
Learning analytics tools should be useful, i.e., they should be usable and provide the functionality needed to reach the goals attributed to learning analytics. This paper seeks to unite learning analytics and action research. On this basis, we investigate how the multitude of questions that arise during technology-enhanced teaching and learning can be systematically mapped to sets of indicators. We examine which questions are not yet supported and propose concepts for indicators that have a high potential to positively influence teachers' didactical considerations. Our investigation shows that many questions teachers ask cannot be answered with currently available research tools. Furthermore, few learning analytics studies report on measuring impact. We describe which effects learning analytics should have on teaching and discuss how these could be evaluated.