Background Identification and assessment of professional competencies for medical students is challenging. We have recently developed an instrument with which PBL tutors can assess the essential professional competencies of medical students in Problem-Based Learning (PBL) programs. This study aims to evaluate the reliability and validity of medical students' professional competency scores obtained with this instrument in PBL tutorials.

Methods Each group of seven to eight students in PBL tutorials (Year 2, n = 46) was assessed independently by two faculty members. Each tutor assessed the students in his/her group every five weeks, on four occasions. The instrument consists of ten items measuring three main competency domains: interpersonal, cognitive, and professional behavior. Each item is scored on a five-point Likert scale (1 = poor, 5 = exceptional). Reliability of the professional competency scores was calculated using generalizability (G) theory with raters nested within occasions. Criterion-related validity was assessed by testing correlations with students' scores in the written examination.

Results The overall generalizability coefficient (G) of the professional competency scores was 0.80. Students' professional competency scores (universe scores) accounted for 27% of the total variance across all score comparisons. The variance due to occasions accounted for 10%, while the student-occasion interaction was zero. The variance due to raters nested within occasions represented 8% of the total variance, and the remaining 55% was due to unexplained sources of error. The interpersonal domain showed the highest reliability (G = 0.84) and the professional behavior domain the lowest (G = 0.76). Results from the decision (D) study suggested that adequate dependability (G = 0.71) can be achieved by using one rater over five occasions. Furthermore, there was a positive correlation between the written examination scores and the cognitive competency scores (r = 0.46, P < 0.01), but not with the other two competency domains (interpersonal and professionalism).

Conclusions This study demonstrates that professional competency assessment scores of medical students in PBL tutorials have acceptable reliability. Further studies validating the instrument are required before it is used for summative evaluation of students by PBL tutors.
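As an illustrative sketch (not the authors' code), the reported G coefficients can be reproduced from the variance components given in the abstract, assuming a standard relative G coefficient for a persons × (raters : occasions) design in which person variance is divided by itself plus the measurement error averaged over occasions and raters:

```python
def g_coefficient(var_person, var_po, var_residual, n_occasions, n_raters):
    """Relative G coefficient for a p x (r:o) design.

    var_person   -- universe-score (person) variance
    var_po       -- person-by-occasion interaction variance
    var_residual -- residual (person-by-rater-within-occasion, error) variance
    """
    # Relative error variance shrinks as occasions and raters are averaged over
    rel_error = var_po / n_occasions + var_residual / (n_occasions * n_raters)
    return var_person / (var_person + rel_error)

# Variance components as proportions of total variance, from the abstract:
# person = 27%, person-occasion interaction = 0%, unexplained error = 55%
var_person, var_po, var_residual = 0.27, 0.0, 0.55

# Original design: two raters on four occasions
print(round(g_coefficient(var_person, var_po, var_residual, 4, 2), 2))  # 0.8

# D-study scenario: one rater over five occasions
print(round(g_coefficient(var_person, var_po, var_residual, 5, 1), 2))  # 0.71
```

Under these assumptions the two reported coefficients (G = 0.80 for the original design and G = 0.71 for the D-study scenario) fall out directly, which is consistent with the nested design described in the Methods.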
Purpose The purpose of this paper is to identify the essential profession-related competencies, clinical knowledge, and skills that medical students should develop in the early stages of their education for future professional practice.

Design/methodology/approach A literature review and workshop produced a list of 46 crucial profession-related competencies. In the first round of the modified Delphi survey (feedback questionnaire), experts identified 26 items (via a Likert scale). In the second round, faculty members highlighted ten items. Statistical analysis yielded four domains with items clustered as follows: interpersonal competencies (communication and collaboration), cognitive skills (problem solving, critical thinking, and reflectivity), work-related skills (planning and time management), and professionalism (integrity, sense of responsibility, respect, and empathy).

Findings The results of this study provide insights into, and implications for, the competencies that are essential to assess and facilitate in the early stages of a medical curriculum. The study also anticipates the challenges of facilitating and assessing these competencies, as pointed out in recent literature. Overall, the outcomes suggest that instead of categorizing the competencies, it is more meaningful to take a holistic, integrated approach to conceptualizing, facilitating, and assessing them in the context of the complexities of real-life situations.

Originality/value Ten items were identified as essential profession-related competencies that should be incorporated during the early stages of medical education. Six of the ten items were agreed upon by all participants of the study: collaboration, communication, problem solving, integrity, responsibility, and respect. This list aligns with the existing literature and with graduate attributes internationally. Items related to planning and time management, critical thinking, and reflectivity were regarded as specifically lacking and as important areas of improvement for Arabic students. Divergence on the items of empathy and medical ethics was observed between international and local panels, with the main concern, raised by medical faculty, being how to facilitate and assess these items. The competencies identified mandate reforms in medical school curricula so that essential skills are implemented early in medical students' careers.
Objectives Continuous formative assessment with appropriate feedback is the pillar of effective clinical teaching and learning. The Group Objective Structured Clinical Examination (GOSCE) has been reported as a resource-effective method of formative assessment. The present study aims to describe the development and evaluation of GOSCE as a formative assessment for pre-clerkship medical students.

Methods At the University of Sharjah, GOSCE was introduced to medical students in Years 1, 2, and 3. The GOSCE was conducted as a formative assessment in which groups of 4–5 students were observed while performing various clinical skills, followed by structured feedback from clinical tutors and peers. The GOSCE was evaluated both quantitatively and qualitatively, and appropriate statistical analysis was applied to evaluate the responses.

Results A total of 232 students who attended the GOSCE responded to the questionnaires. Most of the students and clinical tutors preferred formative GOSCE over individual feedback. Both students and clinical tutors valued the experience, as it helped students to identify gaps and to share knowledge and skills among group members.

Conclusion This study found that formative GOSCE provided a valuable and feasible educational opportunity for students to receive feedback about their clinical skills.