Background: Programmatic assessment that looks across a whole year may contribute to better decisions than those made from isolated assessments alone. The aim of this study is to describe and evaluate a programmatic system for handling student assessment results that is aligned not only with learning and remediation, but also with defensibility. The key components are standards-based assessments, use of a "Conditional Pass", and regular progress meetings.

Methods: The new assessment system is described. The evaluation is based on years 4-6 of a 6-year medical course. The types of concerns staff had about students were clustered into themes, alongside any interventions and outcomes for the students concerned. The likelihood of passing the year, according to type of problem, was compared before and after the new assessment system was phased in.

Results: The new system was phased in over four years. In the fourth year of implementation, 701 students had 3539 assessment results, of which 4.1% were Conditional Pass. More in-depth analysis of 1516 results available from 447 students revealed that the odds ratio (95% confidence interval) for failure was highest for students with problems identified in more than one part of the course (18.8 (7.7-46.2), p < 0.0001) or with problems with professionalism (17.2 (9.1-33.3), p < 0.0001). The odds ratio for failure was lowest for problems with assignments (0.7 (0.1-5.2), NS). Compared with the previous system, more students failed the year under the new system on the basis of performance during the year: 20 (4.5%) compared with four (1.1%) under the previous system (p < 0.01).

Conclusions: The new system detects more students in difficulty and has resulted in less "failure to fail". The requirement to state the conditions required to pass has contributed to a paper trail that should improve defensibility. Most importantly, it has helped to detect and act on some of the more difficult areas to assess, such as professionalism.
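The odds ratios and confidence intervals reported above are standard 2x2 contingency-table statistics. As a minimal sketch of how such figures are computed (the counts below are illustrative only, not the study's data), an odds ratio with a Wald 95% confidence interval looks like:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table.

    a: students with the problem who failed    b: with the problem who passed
    c: students without the problem who failed d: without the problem who passed
    (All counts here are hypothetical, for illustration.)
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi
```

For example, `odds_ratio_ci(20, 80, 10, 390)` returns an odds ratio of 9.75 together with its lower and upper confidence bounds.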
An unexpected disruption that occurred close to examinations, but had less physical effect on the environment, had a greater impact on assessment performance than a more severe disruption, and a series of disruptions, to which students had time to adapt and which they could work around. Two theories are offered to explain these observations.
Background: Stress is associated with poorer academic performance, but how to identify vulnerable students is less clear. A series of earthquakes and disrupted learning environments created an opportunity to explore the relationships among stress, student factors, support, and academic performance within a medical course.

Methods: The outcomes were deviations from expected performance on end-of-year written and clinical examinations. The predictors were questionnaire-based measures of connectedness/support, impact of the earthquakes, safety, depression, anxiety, stress, resilience, and personality.

Results: The response rate was 77%. Poorer-than-expected performance on all examinations was associated with greater disruption to living arrangements and fewer years in the country; on the written examination, with not having a place to study; and on the clinical examination, with relationship status, not having the support of others, less extroversion, and feeling less safe. There was a suggestion of a beneficial association with some markers of stress.

Conclusion: We show that academic performance is assisted by students having a secure physical and emotional base. The students who are most vulnerable are those with fewer social networks and those who are recent immigrants.
Construct: The MUSIC® Inventory measures the construct of academic motivation across five factors: empowerment, usefulness, success, interest, and caring. The factors are defined, respectively, in terms of the degree to which the student perceives that they have control over their environment, that the coursework is useful to their future, that they can succeed in the course, that the course and instructional methods are interesting, and that the teacher cares about their wellbeing and their success.

Background: A valid measure of medical students' academic motivation would give medical teachers a method for evaluating the motivational aspects of their course, and provide focus for changes in teaching and learning to improve medical student engagement. While the MUSIC® Inventory structure has been validated in the tertiary setting and with several professional programs, it has not been validated with medical students. The aim of this study was to use both classical test theory and Rasch modelling to assess the reliability and confirm the structure of the five-factor model of the MUSIC® Inventory with medical students.

Approach: 152 medical students completed the 26-item inventory. Descriptive statistics, internal consistency, correlations between factors, confirmatory factor analysis, and Rasch analysis using the rating scale model were performed to determine reliability and validity.

Findings: The five factors showed good internal consistency (Cronbach's alpha 0.87-0.92). Correlations between factors were moderate to high (r = .38-.89). Confirmatory factor analysis highlighted inconsistencies in the factor loadings of three of the items hypothesized to measure interest. Rasch analysis using the rating scale model showed that all items for each factor had good item fit (0.65-1.37). Person separation (2.28-2.85) and reliability (.84-.91) scores indicated that the scales were able to differentiate different levels of respondents. Item separation (2.25-6.97) and reliability scores (.83-.98) indicated that the items of the scales were being differentiated by the respondents.

Conclusions: Rasch analysis indicates that the five factors of academic motivation measured by the MUSIC® Inventory account for the response patterns in data from medical students. However, while the factors of empowerment, usefulness, success, and caring showed the expected reliability and validity using classical analysis, three of the interest items cross-loaded onto the usefulness factor. Possible reasons include ambiguity in the language of the items, or medical students' conceptions of usefulness and interest. Future research will explore medical students' understanding of the language used to measure these factors in further detail.
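The internal-consistency figures above are Cronbach's alpha values. As a minimal, dependency-free sketch of the underlying calculation (the data shapes are illustrative, not the study's dataset):

```python
def cronbach_alpha(item_scores):
    """Cronbach's alpha for a scale.

    item_scores: one list per item, each holding that item's scores
    across all respondents (equal lengths assumed; data is hypothetical).
    """
    def variance(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    k = len(item_scores)  # number of items on the scale
    sum_item_vars = sum(variance(item) for item in item_scores)
    totals = [sum(scores) for scores in zip(*item_scores)]  # per-respondent totals
    return (k / (k - 1)) * (1 - sum_item_vars / variance(totals))
```

Two perfectly correlated items yield an alpha of 1.0; as item responses diverge, alpha falls, which is why values above roughly 0.8, as reported here, are read as good internal consistency.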
From a student perspective, the computer was an acceptable platform for delivering a formative assessment comprising multiple choice questions (MCQs).