Context: High-stakes medical examinations seek to be fair to all candidates, including an increasing proportion of trainee doctors with specific learning differences. We aimed to investigate the performance of doctors declaring dyslexia in the Clinical Skills Assessment (CSA), an objective structured clinical examination for licensing UK general practitioners.

Methods: We employed a cross-sectional design using performance and attribute data from candidates taking the CSA between 2010 and 2017. We compared candidates who declared dyslexia ('early', before their first attempt, or 'late', after failing at least once) with those who did not, using multivariable negative binomial regression to investigate the effect of declaring dyslexia on passing the CSA, accounting for relevant factors previously associated with performance, including number of attempts, initial score, sex, place of primary medical qualification and ethnicity.

Results: Of 20,879 CSA candidates, 598 (2.9%) declared that they had dyslexia. Candidates declaring dyslexia were more likely to be male (47.3% versus 37.8%; p < 0.001) and to have a non-UK primary medical qualification (26.9% versus 22.4%; p < 0.01), but did not differ in ethnicity from those who never declared dyslexia. Candidates who declared dyslexia late were significantly more likely to fail than candidates who declared dyslexia early (40.6% versus 9.2%; p < 0.001), and were more likely to have a non-UK medical qualification (79.3% versus 15.6%; p < 0.001) or to come from a minority ethnic group (84.9% versus 39.2%; p < 0.001). The chance of passing was lower for candidates declaring dyslexia than for those who never declared dyslexia, and lower in those declaring late (incidence rate ratio [IRR], 0.82; 95% confidence interval [CI], 0.70–0.96) than in those declaring early (IRR, 0.95; 95% CI, 0.93–0.97).

Conclusions: A small proportion of candidates declared dyslexia, and they were less likely to pass the CSA, particularly if dyslexia was declared late. Further investigation of potential causes and solutions is needed.
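The incidence rate ratios reported above can be illustrated with a minimal sketch. The counts below are hypothetical, not taken from the study, and the calculation is the standard Wald approximation for an unadjusted rate ratio, not the multivariable negative binomial model the authors fitted:

```python
import math

def rate_ratio_ci(events_a, exposure_a, events_b, exposure_b, z=1.96):
    """Incidence rate ratio (group A vs group B) with a Wald 95% CI.

    events_*   -- number of passes observed in each group
    exposure_* -- number of exam attempts in each group
    """
    irr = (events_a / exposure_a) / (events_b / exposure_b)
    se_log = math.sqrt(1 / events_a + 1 / events_b)  # SE of log(IRR)
    lo = math.exp(math.log(irr) - z * se_log)
    hi = math.exp(math.log(irr) + z * se_log)
    return irr, lo, hi

# Hypothetical counts: 50 passes in 100 attempts vs 60 passes in 100 attempts.
irr, lo, hi = rate_ratio_ci(50, 100, 60, 100)
print(f"IRR = {irr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

An IRR below 1, as in the study's late-declaration group, indicates a lower pass rate per attempt relative to the comparison group; the adjusted IRRs in the paper additionally control for attempts, initial score and demographics, which this raw calculation cannot.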
Background: Substantial numbers of medical students and doctors have specific learning difficulties (SpLDs), and failure to accommodate their needs can disadvantage them academically. Evidence about how SpLDs affect performance across the different licensing assessments during postgraduate general practice (GP) specialty training is lacking. We aimed to investigate the performance of doctors with SpLDs across the range of licensing assessments.

Methods: We adopted the social model of disability as a conceptual framework, which argues that the problems of disability are societal and that barriers restricting life choices for people with disabilities need to be addressed. We used a longitudinal design linking Multi-Specialty Recruitment Assessment (MSRA) records from 2016 and 2017 with Applied Knowledge Test (AKT), Clinical Skills Assessment (CSA), Recorded Consultation Assessment (RCA) and Workplace-Based Assessment (WPBA) outcomes up to 2021. Multivariable logistic regression models accounting for prior attainment and demographics were used to determine the likelihood of doctors with SpLDs passing the licensing assessments.

Results: The sample included 2070 doctors, of whom 214 (10.34%) declared a SpLD. Candidates declaring a SpLD were significantly less likely to pass the CSA (OR 0.43, 95% CI 0.26–0.71, p = 0.001) but not the AKT (OR 0.96, 95% CI 0.44–2.09, p = 0.913) or RCA (OR 0.81, 95% CI 0.35–1.85, p = 0.615). Importantly, they were significantly more likely to have difficulties with WPBA (OR 0.28, 95% CI 0.20–0.40, p < 0.001). In the licensing test subdomains, doctors with a SpLD performed significantly less well on CSA Interpersonal Skills (B = −0.70, 95% CI −1.20 to −0.19, p = 0.007) and RCA Clinical Management Skills (B = −1.68, 95% CI −3.24 to −0.13, p = 0.034).

Conclusions: Candidates with SpLDs encounter difficulties in multiple domains of the licensing tests and during their training. More adjustments tailored to their needs should be put in place for the applied clinical skills tests and during training.
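The odds ratios above come from adjusted multivariable logistic regression. As a rough illustration of what an odds ratio measures, the sketch below computes an unadjusted OR and Wald CI from a hypothetical 2×2 pass/fail table (the counts are invented, and no covariate adjustment is applied):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio for a 2x2 table with a Wald 95% CI.

    a = group 1 passes, b = group 1 fails,
    c = group 2 passes, d = group 2 fails.
    """
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical table: 160/214 SpLD candidates pass vs 1600/1856 comparators.
or_, lo, hi = odds_ratio_ci(160, 54, 1600, 256)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A CI that excludes 1 (as for the CSA and WPBA results above) indicates a statistically significant difference; the study's adjusted ORs additionally control for prior attainment and demographics, which a raw 2×2 table cannot capture.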
The aim of the Clinical Skills Assessment (CSA) is to test a doctor's ability to gather information, apply learned understanding of disease processes, and offer appropriate person-centred care. Effective integration of these skills is a key element of the exam, and one not commonly tested in other disciplines. The CSA is a type of Objective Structured Clinical Examination (OSCE). Most medical OSCEs have stations that each relate to one skill set, such as 'communication' or 'physical examination'. Although that may be more relevant in secondary care, where patients arrive already partly sifted, the primary care doctor is faced daily with undifferentiated, often multiple, problems experienced by patients who vary as much as their symptoms. The GP's task is complex, and is best considered as an integration of skill sets. The CSA is an exit exam, testing for evidence of competence for independent practice as a family doctor; hence we use the entire consultation as the unit of testing. In this article I set out to demystify consulting in the CSA. Throughout, I offer practical tips; these are marked *. I end with a list of suggestions to better prepare candidates for what we all hope will be a good experience.

The GP curriculum and consulting in the clinical skills assessment

Professional module 2.01: The GP consultation in practice requires GPs to:
- Show a commitment to patient-centred medicine
- Balance the patient's values and preferences with the best available evidence
- Demonstrate clear, sensitive and effective communication
- Manage complexity and uncertainty within the limited time available
Our previous three articles reviewed the Applied Knowledge Test (AKT), offering insight, advice and guidance from an examiner's and a trainee's perspective. We now move on to the Clinical Skills Assessment (CSA) and intend to follow similar themes, with advice from both the Clinical Lead for the CSA and a trainee who has recently undertaken the assessment. In this first article, Dr Nicki Williams provides an insider's overview of the exam. Next month we will look at the application process, how to prepare for the CSA, and examiner top tips.
We thank Drs Bhatti and Nayar for responding to our study. 1 They present no evidence to contradict our findings and, despite misunderstanding our analysis and interpretation, reach similar conclusions.

Our study challenges their assertion that ethnic minority trainees, in particular UK-trained ethnic minority doctors in GP specialty training, fail MRCGP because of their ethnicity. We showed that this was not the case in our cohort. 2 Their focus on racial discrimination in the workplace and during training implies the non sequitur that differential attainment must be due to unfair discrimination by examiners and examinations, or by educators in the case of workplace-based assessment. In doing so they denigrate the many ethnic minority doctors in specialty training who pass MRCGP, supported by educators.

Increasing numbers of ethnic minority and overseas-qualified doctors complete the MSRA, a computer-marked assessment of clinical knowledge and judgement, and enter specialty training for general practice. They claim that we 'do not seem to have … taken into account … differential attainment in the MSRA exam', but this is exactly what we have done.

The GMC report Tackling Disadvantage in Medical Education, which shows differential attainment among trainees in all specialties by separately analysing characteristics such as ethnicity, gender and disability, 3 does not contradict our findings. We used multivariable models taking into account intersections between these attributes to elucidate independent predictors of performance in licensing assessments. Attempts to conflate differential attainment with racial discrimination in assessments could itself stereotype doctors and will do little to improve their self-worth or educational outcomes.

Fair Training Pathways for All 4 explores the importance of the educational environment, and we welcome educational initiatives to reduce differential attainment, but these do not undermine the reliability of