The purpose of this study was to evaluate the ability of specific types of multiple-choice questions delivered using an Audience Response System (ARS) to maintain student attention in a professional educational setting. Veterinary students (N = 324) enrolled in the first three years of the professional curriculum were presented with four different ARS question types (knowledge base, discussion, polling, and psychological investment) and no ARS questions (control) during five lectures presented by 10 instructors in 10 core courses. Toward the end of each lecture, students were polled to determine the relative effectiveness of specific question types. Student participation was high (76.1% ± 2.0%), and most students indicated that the system enhanced the lecture (64.4%). Knowledge base and discussion questions resulted in the highest student-reported attention to lecture content. Questions polling students about their experiences resulted in attention rates similar to those without use of ARS technology. Psychological investment questions, based on upcoming lecture content, detracted from student attention. Faculty preparation time for three ARS questions was shorter for knowledge base questions (22.3 min) than for discussion and psychological investment questions (38.6 min and 34.7 min, respectively). Polling questions required less time to prepare (22.2 min) than discussion questions but did not differ from the other types. Faculty stated that the investment in preparation time was justified by the impact on classroom atmosphere. These findings indicate that audience response systems enhance attention and interest during lectures when used to pose questions that require application of an existing knowledge base and allow for peer interaction.
Large class sizes and limited numbers of instructors in anatomy courses make it challenging for student laboratory groups to have their questions addressed in a timely manner. Instructors are often unaware of how many requests for assistance are pending, or of the order in which assistance was requested, and students often wait a long time for an instructor to become available. Brainstorming with some of our students suggested a call-button system of sorts. Instructors, in consultation with the college's IT department, devised a solution using the question-and-answer (Q&A) feature of Zoom Webinars to manage student questions. A Zoom Webinar can be broadcast to up to 50,000 attendees, and instructors, logged in as panelists (on a mobile device, e.g., an iPad), can interact with the student attendees via the Q&A feature. Students join the webinar using their dissection table number as their ID and request assistance in the Q&A. These requests appear with a time stamp and are automatically queued in the panelist's Q&A window. Instructors use the type-answer feature to acknowledge a question by typing their initials, which the other instructors (panelists) can see live. This allows student questions to be queued so that instructors can address them in a timely, first-in/first-out order. Student feedback on the use of this system in the Small Animal Anatomy course was positive.
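The queueing behavior described above, in which time-stamped requests are answered first-in/first-out and an instructor claims each one by typing their initials so colleagues do not double-cover it, can be sketched as a small model. This is an illustration only, not Zoom's API; the class and field names (`Request`, `QuestionQueue`, `acknowledge_next`) are hypothetical.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class Request:
    """One student request: table ID, question text, submission time,
    and the initials of the instructor who has claimed it (if any)."""
    table: str
    question: str
    timestamp: datetime
    acknowledged_by: Optional[str] = None

class QuestionQueue:
    """Minimal model of a time-stamped, first-in/first-out help queue."""

    def __init__(self) -> None:
        self._requests: list[Request] = []

    def submit(self, table: str, question: str, timestamp: datetime) -> None:
        # A table posts a request; it is stored with its time stamp.
        self._requests.append(Request(table, question, timestamp))

    def acknowledge_next(self, initials: str) -> Optional[Request]:
        # Claim the oldest unacknowledged request by recording the
        # instructor's initials, mirroring the type-answer acknowledgment.
        for req in sorted(self._requests, key=lambda r: r.timestamp):
            if req.acknowledged_by is None:
                req.acknowledged_by = initials
                return req
        return None  # nothing left in the queue
```

In this sketch the visible initials serve the same coordination role as in the abstract: once a request carries initials, `acknowledge_next` skips it, so two instructors never claim the same table.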