The prevalent use of student ratings in teaching evaluations, and particularly the reliability of such data, has been debated for many years. Reports in the literature indicate that many factors influence student perceptions of teaching. Three of these factors were investigated at the University of Western Australia: broad discipline group, course/unit year level and student gender. Data collected over three years were analysed. The outcomes of this study confirmed results reported by other workers in the field, namely that student ratings differ across discipline groups and year levels. The study also suggested a possible explanation for the mixed results reported in studies of student gender in relation to student ratings.
Purpose – The paper aims to disseminate solutions to common problems in student evaluation processes. It proposes that student evaluation can be applied to quality assurance and to improving learning and teaching. The paper presents solutions in the areas of presenting outcomes as performance indicators, constructing appropriate surveys, improving response rates, reporting student feedback to students, and student engagement as a feature of university quality assurance.
Design/methodology/approach – The research approach is a comparative case study, allowing in-depth exploration of multiple perspectives and practices at seven Australian universities. Process and outcome data were rigorously collected, analysed, compared and contrasted.
Findings – The paper provides empirical evidence for student evaluation as an instrument of learning and teaching data analysis for quality improvement. It suggests that collecting data about student engagement and the student experience will yield more useful data about student learning. Furthermore, the findings indicate that students benefit from more authentic inclusion in the evaluation process and its outcomes.
Research limitations/implications – Because of the chosen research approach, the results may lack generalisability. Researchers are therefore encouraged to test the propositions further and apply them to their own university contexts.
Practical implications – The paper includes recommendations at the institutional and sector-wide levels for using student evaluation effectively as a university performance indicator and as a tool for change.
Originality/value – The paper fulfils an identified need to examine student evaluation processes across institutions and focuses on the role of student evaluation in quality assurance.
Purpose – This paper reports the findings of a study into the automated text analysis of student feedback comments, undertaken to help investigate a high volume of qualitative information at various levels in an Australian university. It includes the drawbacks and advantages of using selected applications and established lexicons. Emphasis has traditionally been placed on analysing the statistical data collected through student surveys of learning and teaching, while the qualitative comments provided by students are often not systematically scrutinised. Student comments are important because they provide a level of detail and insight that is essential to quality assurance practices.
Design/methodology/approach – The paper outlines the process by which the institution researched, developed and implemented the automated analysis of students' qualitative comments in surveys of units and teaching.
Findings – The findings indicate that there are substantial benefits in implementing this automated process, particularly in the analysis of evaluation data for units with large enrolments. The analysis improved the efficiency with which student comments were interpreted. However, a degree of human intervention is still required to create reports that are meaningful and relevant to the context.
Originality/value – The paper is unique in its examination of one institution's journey in developing a process to support academic staff in interpreting and understanding student comments provided in surveys of units and teaching.
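The paper does not specify which applications or lexicons the institution adopted. As a purely illustrative sketch of what lexicon-based analysis of survey comments could look like, the snippet below uses the open-source VADER lexicon (via the vaderSentiment package) to label hypothetical comments and aggregate sentiment counts per unit; the unit codes, comments and score thresholds are assumptions, not the institution's actual tooling.

```python
# A minimal sketch, assuming a lexicon-based approach (VADER) stands in for the
# "established lexicons" the paper mentions. All data below are hypothetical.
from collections import Counter, defaultdict

from vaderSentiment.vaderSentiment import SentimentIntensityAnalyzer

analyzer = SentimentIntensityAnalyzer()

# Hypothetical (unit_code, comment) pairs, as they might appear in a survey export.
comments = [
    ("STAT1000", "The lectures were engaging and the examples were clear."),
    ("STAT1000", "Too much content crammed into the final weeks."),
    ("CHEM2001", "Feedback on lab reports arrived far too late to be useful."),
]

def classify(text, pos_cut=0.05, neg_cut=-0.05):
    """Map VADER's compound score onto a coarse positive/neutral/negative label."""
    compound = analyzer.polarity_scores(text)["compound"]
    if compound >= pos_cut:
        return "positive"
    if compound <= neg_cut:
        return "negative"
    return "neutral"

# Aggregate sentiment counts per unit so that units with large enrolments can be
# summarised without reading every comment individually.
summary = defaultdict(Counter)
for unit, text in comments:
    summary[unit][classify(text)] += 1

for unit, counts in sorted(summary.items()):
    print(unit, dict(counts))
```

Consistent with the paper's finding that human intervention remains necessary, aggregated counts like these would still need to be interpreted by staff and turned into reports that are meaningful for the particular unit and teaching context.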