Providing students with high-quality feedback is important and can be achieved using computer-based systems. While student and educator perspectives of such systems have been investigated, a comprehensive multidisciplinary study has not yet been undertaken. This study examines student and educator perspectives of a computer-based assessment and feedback system at undergraduate and postgraduate levels, across a range of disciplines, course sizes and levels of educator experience. The results suggest that students and educators may enjoy quality, efficiency and quantity benefits from such an approach, without the drawbacks of low personalisation and specificity identified within the literature. Time and effort are, however, required of educators for training and installation. Thus, workload and training allowances are recommended for practitioners, particularly for large courses where group assignments require highly personalised feedback.
Assessment of student learning is a core function of educators. Ideally, students should be provided with timely, constructive feedback to facilitate learning. However, providing high-quality feedback becomes more complex as class sizes grow, modes of study expand and academic workloads increase. ICT solutions are being developed to facilitate quality feedback without adversely impacting staff workloads. Hence, the research question of this study is: 'How do academic staff perceive the usefulness of an automated feedback system in terms of its impact on workloads and the quality of feedback?' This study used an automated feedback generator (AFG) across multiple tutors and assessment items within an MBA course delivered in a variety of modes. All academics marking in the course completed a survey based on an adaptation of the unified theory of acceptance and use of technology (UTAUT) model. Results indicated that while the workload impact was generally positive, with savings in both cost and time, improvements and modifications to the system could further reduce workloads. Furthermore, results indicated that the AFG improves quality in terms of timeliness, greater consistency between markers and an increase in the amount of feedback provided.
Timely, constructive feedback on assessment is critically important to students, yet it is increasingly difficult for time-poor academics to provide consistently. Marketing educators also face pressure to incorporate sustainability into both the curriculum and practices such as assessment. This article outlines the development of an innovative computer-based marking tool, SuperMarkIt (SMI), which promotes sustainability. Using an action research method, this article reports the results of the sixth cycle of review and improvement, with data gathered from both educators and students. Results support the improvements in sustainability achieved by using SMI. Educationally, students report improved feedback quality in areas such as timeliness, legibility, constructiveness, personalization, and detail. Economically, academics report major increases in feedback quantity, and students report reductions in time and cost. Environmentally, SMI reduces reliance on paper as well as the travel required to submit and collect hard-copy assessments. Continued development of SMI focuses on improving aspects related to technology adoption by educators.
This study investigates the application of voice recognition technology to online lectures focusing on the efficacy of the text component of a multimedia presentation. Specifically, participants were provided with online access to multimedia instructional packages comprising an image of the lecturer with accompanying computer slides, plus simultaneous scrolling text of the words spoken during the lecture. Participants' knowledge was measured before and after the lecture presentation. Contrary to cognitive load theory, the results did not show a negative redundancy effect, that is, there were no differences in learning efficacy between the conditions with and without on-screen text. Further, participants found no difference between text edited for semantic breaks compared to unedited text. The implications for online instructional design are that resources are better spent providing a combination of audio and slides rather than text and slides, and that if text is provided then editing for semantic line breaks is not warranted.