Background: Medical students facing high-stakes exams want study resources that relate directly to their assessments. At the same time, they need to develop the skills to think analytically about complex clinical problems. Multiple-choice questions (MCQs) are widely used in medical education and can promote surface learning strategies, but creating MCQs requires both in-depth content knowledge and sophisticated analytical thinking. We therefore piloted an MCQ-writing task in which students developed MCQs for their peers to answer.
Methods: Students in a fourth-year anatomic pathology course (N = 106) were required to write MCQs using the PeerWise platform. Each student created two MCQs for each of four topic areas, and the MCQs were answered, rated, and commented on by their classmates. Questions were rated for cognitive complexity, and a paper-based survey was administered to investigate whether the activity was acceptable and feasible, and whether it promoted desirable learning behaviours in students.
Results: Students were able to create cognitively challenging MCQs: 313 of the 421 MCQs we rated (74%) required the respondent to apply or analyse pathology knowledge. However, students who responded to the end-of-course questionnaire (N = 62) saw the task as having little educational value. Students found PeerWise easy to use and indicated that they read widely to prepare questions and monitored the quality of their questions. They did not, however, engage in extensive peer feedback via PeerWise.
Conclusions: Our study showed that the MCQ-writing task was feasible and engaged students in self-evaluation and in synthesising information from a range of sources, but it was not well accepted and did not strongly engage students in peer learning. Although students were able to create complex MCQs, they found some aspects of the writing process burdensome and tended not to trust the quality of each other's MCQs. Given the evidence that the task promoted deep learning, it is worth continuing this mode of teaching if it can be made more acceptable to students.
The flipped classroom approach has been used in health professions education to encourage active learning, but pre-class flipped activities are predominantly passive knowledge input, such as watching recorded lectures. To encourage active learning, online learning activities focusing on knowledge acquisition and application were developed as pre-work for the flipped classroom in a pathology course in an undergraduate medical programme. These included labelling macroscopic pathology and histology images, case scenarios, and multiple-choice questions. Student responses indicated that some of these were helpful, while others interfered with learning.
In addition to the pedagogical issues associated with designing a flipped classroom, instructors also need to consider the technical elements of designing pre-course material for independent study. This paper examines the components of this flipped classroom's pre-work and highlights successful and unsuccessful e-learning components.
Cognitively complex assessments encourage students to prepare using deep learning strategies rather than surface-learning, recall-based ones. To prepare such assessment tasks, it is necessary to have some way of measuring cognitive complexity. In the context of a student-generated MCQ-writing task, we developed a rubric for assessing the cognitive complexity of MCQs based on Bloom's taxonomy, simplifying the six-level taxonomy into a three-level rubric. Three rounds of moderation and rubric development were conducted, in which 10, 15, and 100 randomly selected student-generated MCQs were independently rated by three academic staff. After each round of marking, inter-rater reliability was calculated, qualitative analysis of areas of agreement and disagreement was conducted, and the markers discussed the cognitive processes required to answer the MCQs. Inter-rater reliability, measured by the intra-class correlation coefficient, increased from 0.63 to 0.94, indicating that the markers rated the MCQs consistently. The three-level rubric was found to be effective for evaluating the cognitive complexity of MCQs generated by medical students.
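The abstract does not state which form of the intra-class correlation coefficient was used, so as a minimal sketch, assuming a two-way random-effects, absolute-agreement, single-rater model (ICC(2,1)) and the three-rater design described above, the coefficient can be computed from an MCQs-by-raters score matrix as follows. The ratings and the function name icc_2_1 are hypothetical, for illustration only.

```python
import numpy as np

# Ratings matrix: rows = MCQs (targets), columns = the three raters.
# Hypothetical scores on a three-level rubric (1 = recall,
# 2 = apply/analyse, 3 = highest level) for six sample questions.
ratings = np.array([
    [2, 2, 2],
    [1, 1, 2],
    [3, 3, 3],
    [2, 3, 2],
    [1, 1, 1],
    [2, 2, 3],
], dtype=float)

def icc_2_1(x: np.ndarray) -> float:
    """ICC(2,1): two-way random effects, absolute agreement, single rater."""
    n, k = x.shape
    grand = x.mean()
    # Mean squares from the two-way ANOVA decomposition.
    ms_rows = k * np.sum((x.mean(axis=1) - grand) ** 2) / (n - 1)  # between MCQs
    ms_cols = n * np.sum((x.mean(axis=0) - grand) ** 2) / (k - 1)  # between raters
    ss_err = np.sum((x - grand) ** 2) - (n - 1) * ms_rows - (k - 1) * ms_cols
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n
    )

print(f"ICC(2,1) = {icc_2_1(ratings):.2f}")  # ~0.74 for this sample
```

For a real analysis, an established implementation such as pingouin's intraclass_corr, which reports all six ICC forms with confidence intervals, would be preferable to hand-rolled code.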