The pedagogical approach for both didactic and laboratory teaching of anatomy has changed in the last 25 years and continues to evolve; however, assessment of students' anatomical knowledge has not changed despite widespread awareness of Bloom's taxonomy. For economic reasons, most schools rely on multiple choice questions (MCQs), which test mastery of factual knowledge, while competencies such as critical thinking and skill development are not typically assessed. In contrast, open-ended question (OEQ) examinations demand knowledge construction and a higher order of thinking, but faculty require more time to score the constructed responses. This study compares performance on MCQ and OEQ examinations administered to a small group of incoming first-year medical students in a preparatory (enrichment) anatomy course that covered the thorax and abdomen. In the thorax module, the OEQ examination score was lower than the MCQ examination score; however, in the abdomen module, the OEQ examination score improved compared to the thorax OEQ score. Many students attributed their improved performance to a change from simple memorization (superficial learning) for cued responses to conceptual understanding (deeper learning) for constructed responses. The results support the view that assessment with OEQs, which requires in-depth knowledge, would result in better student performance on examinations. Anat Sci Educ 11: 254-261. © 2017 American Association of Anatomists.
The COVID-19 pandemic resulted in rapid changes to the delivery of medical school exams during the height of the pandemic. Remote, online exams quickly became the norm, with many schools initially allowing open book (OB) exams due to a lack of policies and resources to monitor for cheating, and as a means of reducing student anxiety heightened by the uncertainty of the pandemic and its effect on their education. A 2016 review by Durning et al. concluded that there was no significant difference in exam performance between OB and closed book (CB) exams; however, a recent study found significantly higher scores on an OB exam compared to a CB exam (Eurboonyanun et al., 2021).

At Cooper Medical School of Rowan University (CMSRU), gross anatomy practical examinations are taken using cadavers and are CB. As a result of the pandemic, the Class of 2023 was administered the Skin and Musculoskeletal (SMS) midcourse and final practical examinations as remote, online OB exams. The next year, the Class of 2024 took the same exams remotely online as CB exams. The aim of this study was to evaluate the impact of OB exam conditions on students' online practical exam performance. We hypothesized that the Class of 2023's mean score on both exams would be significantly higher than that of the Class of 2024.

We compared the Class of 2023's (n = 111) remote, online OB mean scores for the midcourse and final gross anatomy practical summative exams to the Class of 2024's (n = 110) remote, online CB exam scores for the same practical exams. The SMS course is the last course for first-year medical students at CMSRU. For both classes, the midcourse and final practical exam questions were delivered remotely online using cadaveric atlas images. Questions were open-ended and consisted of a mixture of first-, second-, and third-order questions. The numbers of questions on the midcourse and final practical exams were equivalent for both classes. A one-tailed, non-paired t-test was used to compare mean class performance on both the midcourse and final practical exams.

The Class of 2023 had a significantly higher summative mean score on the midcourse practical exam compared to the Class of 2024 (96.91 vs. 90.24, p < 0.001). The summative mean score for the final practical exam was also significantly higher for the Class of 2023 compared to the Class of 2024 (93.21 vs. 89.26, p < 0.001).

First-year medical students who took remote, online OB midcourse and final practical exams in the SMS course outperformed first-year medical students who took equivalent exams that were CB. The COVID-19 pandemic has opened a window for educators to give greater consideration to, and allow for exploration of, the use of OB exams in anatomy, regardless of whether exams are delivered online or in person. In a world where knowledge is available instantaneously with just a few strokes of a keypad, why do we still expect students to commit detailed anatomical knowledge to memory for testing? As practicing clinicians, they will have access to resources to help them answer questions they are unsure ...
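As a rough illustration of the statistical comparison described in this abstract, the following is a minimal sketch of a one-tailed, unpaired t-test in Python. The per-student score arrays are hypothetical placeholders generated for illustration; the abstract reports only the class means, sample sizes, and significance thresholds.

```python
# Minimal sketch of a one-tailed, non-paired (independent-samples) t-test,
# as described in the abstract above. The per-student scores below are
# hypothetical placeholders; only the class means (96.91 vs. 90.24) and
# sample sizes (n = 111, n = 110) come from the abstract.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical score distributions for each class (spread is assumed).
ob_scores = rng.normal(loc=96.91, scale=3.0, size=111)  # Class of 2023, open book
cb_scores = rng.normal(loc=90.24, scale=3.0, size=110)  # Class of 2024, closed book

# One-tailed test: the alternative hypothesis is that the open-book mean
# exceeds the closed-book mean, hence alternative="greater".
t_stat, p_value = stats.ttest_ind(ob_scores, cb_scores, alternative="greater")
print(f"t = {t_stat:.2f}, one-tailed p = {p_value:.4g}")
```

Note that `scipy.stats.ttest_ind` assumes equal variances by default; passing `equal_var=False` would yield Welch's t-test instead. The abstract does not state which variant was used.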
The sternalis is an uncommon, variant muscle of the anterior thoracic wall that is estimated to be present in 8% of the human population. Students in a medical gross anatomy course were fortunate to discover a right, unilateral sternalis muscle during dissection of a 76-year-old female Caucasian cadaver. The sternalis appeared as a ribbon-like strap and measured 15 cm in length with an average width of 2.5 cm. It was oriented on the anterior thoracic wall along the right margin of the sternum, medial to the sternocostal head of the right pectoralis major muscle. Despite its rarity, radiologists must be aware of the possibility of encountering the sternalis during thoracic imaging (CT scans, mammography, MRI) because of the risk of its misdiagnosis as a tumor. Further, the risk of surgical complications, such as damage to this muscle during breast surgery, must also be considered. Thus, although it may be difficult to perform a proper anatomic analysis of the infrequently observed sternalis muscle, it is important that students and clinicians be aware of its existence because of its potentially significant impact on clinical diagnosis and patient management. Therefore, anatomic studies through prosections, illustrations, photographs, diagnostic images, and detailed descriptions are warranted to increase awareness of the sternalis muscle and its variations among clinicians (especially radiologists and surgeons).
Anatomy teaching in medical school has changed in the last few decades. However, assessment of students' knowledge has not changed despite widespread awareness of Bloom's taxonomy. For economic reasons, we rely on multiple choice question (MCQ) examinations (selected response), which test only knowledge recall; competencies such as critical thinking and skill development, which are part of academic achievement and progress, are not assessed. In contrast, constructed response testing demands knowledge construction and a higher level of thinking; however, it is more time consuming for faculty to score. We conducted a pilot study in which we created an MCQ examination that was also formatted as a short-answer examination requiring content-specific knowledge and critical thinking. We administered this approach to a small group of incoming first-year medical students taking a preparatory anatomy course. On the midterm examination, the constructed response score was much lower than the selected response score. However, on the final examination, the constructed response score improved considerably compared to the midterm score. Students reported that their improved performance was due to changes in how they studied and learned anatomy, rather than relying on familiarity with the material. While this outcome is encouraging, the feasibility of applying this approach to a large class needs to be explored. Supported by the Office of Education and the Department of Cell Biology and Molecular Medicine.