Educational videos are becoming more prevalent within higher education, and their use is now taken for granted. However, the full impact videos have on learning is under-researched and not fully known. This study was conducted to investigate the effectiveness of quiz questions embedded throughout a video. Students from two different modules (n₁ = 102, n₂ = 23) watched three different formats of video, and the results of a subsequent multiple-choice test were recorded and compared. In addition, viewing behaviour was recorded and explored to evaluate whether it also affected results. The results highlighted that test performance improved significantly after watching the video with quiz questions embedded throughout. In contrast to the test scores, students' perceptions did not identify any differences; however, students' qualitative comments showed overwhelming support for quizzes embedded throughout a video. Implications for professional practice and further research to build upon this study are discussed.
Undergraduate students are often required to collect survey data as part of their studies, but they rarely receive detailed guidance on choosing an appropriate free online survey tool. In addition, many universities do not provide undergraduate students with an institutionally supported and managed online survey tool. Because there are so many online survey services available, the lack of an institutionally managed survey tool, coupled with a lack of proper guidance on selection and use, can cause a great deal of stress and possible expense to students. In order to alleviate this problem, ten prominent free online survey services were reviewed in order to give students, particularly undergraduate students in higher education, some guidance in this matter. Three essential criteria were borne in mind when evaluating the tools: ease of use; ability to export data; and compliance with the UK Data Protection Act. Although this paper is predominantly focused on UK students undertaking surveys that collect data which could personally identify a respondent, the conclusions are generalised to include recommendations for surveys collecting non-personally identifiable data and for students studying outside the UK. Based on the findings of the review, students needing to use a free online survey tool are recommended to use eSurv for all surveys, unless they are given alternative directions by academic staff or others at their institution. We further recommend that both eSurv and Quick Surveys are appropriate for surveys collecting non-personally identifiable data.
Given the current popularity of educational videos, and given the time, effort and expense academics and institutions invest in providing educational videos to students, it was thought worthwhile to evaluate whether students at the University of Northampton (UoN) actually want and use these resources and, if they do, whether the videos are in a format that students want. The study was carried out in two distinct stages: a questionnaire followed by a focus group. It was found that students at Northampton do overwhelmingly use educational videos. Furthermore, the research found that students prefer videos to any other resource and that videos can increase motivation. Additionally, high-risk production strategies, such as seeing the presenter on screen and the use of animation, humour and quizzes, were identified, and the use of music in an educational video was considered a negative component. The optimum length of a video is less clear; however, it is recommended that videos are kept to less than 10 minutes (although this is dependent upon the student's level of study). The key recommendation when producing videos is to ensure they have been designed with cognitive research taken into account. The key strength of a well-designed educational video, it is concluded, is to give students something additional that they cannot find in another resource, in a way which encourages effective learning.
The Learning Development (LD) team at the University of Northampton comprises specialist tutors who provide advice and guidance to all students on academic and study skills. This advice is delivered through one-to-one tutorials, embedded workshops, drop-in sessions and their online Skills Hub. A research project was initiated to ascertain the impact of their work. Measuring impact is challenging and is the perennial problem within the global Learning Development community. The project aimed to:
• assess student awareness of the service
• identify the reasons why students use LD, or choose not to use it
• measure the effectiveness and impact of LD on the students who use the service compared to those who do not
• estimate the impact of the LD service on student retention
Over the period of an academic year, several data collection methods were employed: reviewing longitudinal data from undergraduate student assessments (from 16,194 students) over three years; analysing a questionnaire with responses from over 250 current students drawn from the entire student population; collating 161 questionnaires from students who had used the LD team; and undertaking semi-structured interviews with current students. This chapter outlines the impact of the LD team upon student learning and academic development. It examines the importance of the role of an LD service and reviews the team's focus on aligning its work with that of faculty colleagues to ensure that academic skills are embedded in the curriculum. Finally, it puts forward an approach to measuring the impact of Learning Development as a discipline on the retention and progression of UK HE students.
This project uses magic to explore dissertation skills with students. In a session on preparing for the dissertation, students learnt a magic trick and then used their experience of learning the trick to reflect and to develop narratives around their dissertation topic, focussing on the skills of researching and writing. We compared the results of the intervention group to those of a control group (who were given the same session but without the magic trick). The teaching sessions integrated skills essential for completing the dissertation, such as critical thinking, linking, metacognitive reflection, and conceptualising the process of a long project. Previous research has suggested that using magic can stimulate curiosity, engage and motivate students, and make sessions more memorable (see Moss, Irons and Boland, 2017; Wiseman and Watt, 2020; Wiseman, Wiles and Watt, 2021). The presentation reported the findings from pre- and post-session questionnaires completed by participants to evaluate the use of a magic trick in teaching dissertation skills by: evaluating the effectiveness of using a magic trick to teach dissertation skills; evaluating the use of magic to make skills teaching more memorable; evaluating the use of magic to support motivation and positive emotions around dissertation tasks; and evaluating the use of magic to counter some of the negative affective states students encounter, such as lack of motivation or negative self-efficacy beliefs.