The MOOC (Massive Open Online Course) is the latest buzzword in distance education, bringing together, for the first time, high-quality 'Ivy League' providers, online education and a low-cost model. MOOCs align with the approaches to teaching and learning advocated by the Kronberg Declaration (UNESCO, 2007), whereby learners 'play an ever more active role in knowledge acquisition and sharing' with 'the role of teachers and instructors decreasing'. While the basic design of individual MOOCs may not have moved far beyond traditional pedagogical approaches, being largely lecture-based in format, the concept itself is learner-centred, liberating students to access and engage with education. To date, however, MOOCs appear to be falling short of these lofty ambitions for a radical shift in the way that education is delivered: the evidence strongly suggests that, by any reasonable measure, learner engagement in MOOCs is underwhelming compared with the massive enrolments that many have secured. With these issues in mind, the authors ran an eight-week MOOC in the summer of 2013, which generated data for a randomised controlled trial considering the impact of three factors on student engagement: 1) workload, 2) task design and 3) the level and nature of facilitation. As part of the design, five user groups were randomly created to investigate the contribution of each of these factors to the level of student engagement and to determine which, if any, was most significant. Engagement was measured by tracking user activity and collecting student feedback. This paper reports the initial results of this experiment in MOOC delivery.
Students are learning to interact with, design for, and sometimes even program computers at earlier and earlier ages. Teachers and researchers can relatively easily measure progress in learning programming tasks, but assessment of conceptual understanding of computation, particularly when programming tasks are not present, is less well defined or non-existent. Computational Thinking (CT) generally refers to knowledge and skills apart from, and possibly a precursor to, the ability to write computer programs, yet it is commonly measured through the quality of programming. Are there ways of qualifying CT 'maturity' outside of programming tasks? This study looks at the intersection of CT and CS in first-grade learners who are developing computational solutions involving literacy tasks. Students retell a story by animating characters in Scratch Jr.: they break down the story, create an animation storyboard, and finally implement the plan in Scratch Jr. For most of the participants, this is their first time using Scratch Jr. or any programming language. Given their limited experience with technology, they are simultaneously analysing a story using literacy skills, considering a visual representation of the story, and learning how to express the storyline in a computer language. To better understand the general concept of CT, we analyzed video data of the students building their animated stories, along with the artifacts of design and code. This qualitative data provides indicators of how their thinking progresses from little to at least a working knowledge of writing a program. Using qualitative analysis, we evaluate how their initial design evolves, simplifies, or otherwise changes as the novices move from general problem solving to creating a computational solution. CT appears to be a specialized form of design, and of engineering design in general.
Understanding how these first graders move from a general approach to a problem to a computational solution may give insight into how CT matures and the role of programming in that maturity. By capturing how a design evolves, we can suggest how CT can be 'seen' in artifacts other than programming code. The goal is to find markers of CT in design and language that transcend specific programmed implementations.