Peer Instruction is a popular method of implementing Student Response Systems (SRS) in classroom teaching. Students engage in peer discussion to solve conceptual multiple-choice problems. Before discussion, students are given time to think and to give individual responses with a voting device. In this paper, we investigate how this initial voting session affects students' experiences of the subsequent discussion. The data are based on student interviews, which were analyzed using analytical tools from grounded theory. The students emphasize the individual thinking period as crucial for constructing explanations, argumentation, and participation during discussions, and hence for facilitating learning. However, displaying the results of the initial vote can be devastating for the quality of the discussions, especially when there is a clear majority for a specific alternative. These findings are discussed in light of recent quantitative studies on Peer Instruction.
In this article, we describe and discuss the most significant teacher-centric aspects of student response systems (SRS) that we have found to negatively affect students' experience of using SRS in lecture settings. By doing so, we hope to increase teachers' awareness of how they use SRS and of how seemingly trivial choices when using SRS can have a significant negative impact on students' experiences, especially when these choices are repeated often. We cover areas such as consistency when using SRS, time usage, preparation, teachers' experience level with SRS, teacher commitment and attitudes, teacher explanations, and students' fear that voting results can mislead the teacher. The data are based on 3 years of experience in developing and using an online SRS in classroom lectures, and they consist of focused (semistructured) student group interviews, student surveys and personal observations.

Keywords: audience response systems; clickers; student attitudes; teaching pitfalls

Introduction

A student response system (SRS) can be described as an electronic voting system that presents students with a multiple-choice question, often as part of a quiz, which they answer with a small handheld device (commonly referred to as a ''clicker''), usually after engaging in peer discussions. Benefits from utilising such a system in lectures include increased student motivation and engagement. With such a promising record, it is easy to forget that SRS is only a tool that, if not used correctly, can actually be detrimental to the lectures (Barber and Njus 2007). Focusing primarily on the technology in the belief that it will automatically improve lectures, instead of focusing on how students think and learn, is the single most important reason for failure when implementing new technology in education (Mayer 2005), and SRS is no exception. There are several publications that give best practice guidelines for using SRS (e.g. 
Although this seems to be an elaborate list, these negative aspects are mostly mentioned briefly and not described in depth in the literature (for a list of general challenges with SRS, see Kay and LeSage 2009). In-depth studies on these aspects and how they affect students' experience of SRS are, to the authors' knowledge, lacking. To fully understand and appreciate best practice guides on SRS, it is important to understand how and why different aspects of implementation can have a negative impact on students. In this article, we describe and discuss such aspects after 3 years of experience (since 2009) in developing and using an online SRS for modern handheld devices, such as smartphones, at Sør-Trøndelag University College in Norway. We start this article with background information, describing the classes where SRS was used and the implementation choices for the different years of testing. This is followed by a brief description of research methods and a presentation of our results. We conclude this article with a discussion and conclusions.
The authors have compared students discussing multiple-choice quizzes during peer instruction with and without the initial thinking period before discussion. Video clips of students engaged in peer discussion in groups of three, with varying group compositions (140 different students in total), were compared with students' own experiences extracted from group interviews (16 students in groups of four, seven interviews in total) and survey results (109 responses). The initial thinking period was found to increase argumentation time during discussion, consistent with students' own experiences. However, while students felt that the initial thinking period increased participation and the contribution of ideas among all group members, the authors found significantly improved discussion for only two out of three group members, those already most active. The research did not find any statistically significant difference for the least active students with or without the inclusion of the initial thinking period.
A flipped classroom lecture approach was utilised in an engineering mathematics course (118 students). This article reports on student viewing habits based on 104 videos over a period of 12 weeks. The video statistics indicate that many students waited until the last day before assignments to watch the required videos. There are also indications that students would try to reduce the heavy workload induced by watching all videos on a single day by skipping videos perceived as less valuable. The data show a strong negative correlation between the length of a video and how much of that video students watched per viewing session. However, although students watched less of longer videos, the data also indicate that students still watched, to a large degree, every part of the videos, just not in a single viewing session. Based on these results, recommendations on video creation and flipped classroom implementation are given.