We recently developed a multiple-choice conceptual survey in mechanical waves. The development, evaluation, and use of the survey were reported elsewhere [A. Tongchai et al., Int. J. Sci. Educ. 31, 2437 (2009)]. We administered the survey to 902 students from seven different groups ranging from high school to second-year university. From that analysis we identified several conceptual models which the students seemed to be using when answering the questions in the survey. In this paper we investigate the strength with which students were committed to these conceptual models, as evidenced by the consistency with which they answered the questions. For this purpose we focus on the patterns of student responses to questions in one particular subtopic, wave propagation. The study has three main purposes: (1) to investigate the consistency of student conceptions, (2) to explore the relative usefulness of different analysis techniques, and (3) to determine what extra information a study of consistency can give about student understanding of basic concepts. We used two techniques: categorizing and counting, which is widely used in the science education community, and model analysis, which has recently been introduced into physics education research. Categorizing and counting has been used in very diverse ways, whereas model analysis has so far been employed only in prescriptive ways. Research studies have reported that students often use their conceptual models inconsistently when solving a series of questions that test the same idea; our results support those conclusions. Moreover, our findings suggest that students with more experience in physics learning use the scientifically accepted models more consistently. The two analysis techniques also have different advantages and disadvantages: model analysis can be used in more diverse ways, offers flexibility in analyzing multiple-choice questions, and provides more information about the consistency and inconsistency of student conceptions. An unexpected finding is that studying waves in other contexts (for example, quantum mechanics or electromagnetism) leads to more consistent answers about mechanical waves, suggesting that studying more abstract topics may solidify students' understanding of more concrete waves. While this might be considered intuitive, we have not found direct empirical studies supporting the conjecture.
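For readers unfamiliar with model analysis, the following is a minimal sketch of how a class model density matrix is typically constructed in that framework (after Bao and Redish); the response data, the three-model coding, and all numbers are hypothetical illustrations, not data from the study above.

```python
# Minimal sketch of model analysis on hypothetical data.
# Each student answers m questions; every response is coded as evidence
# for one of three models: 0 = scientific, 1 = common alternative, 2 = null/other.
import numpy as np

responses = np.array([  # hypothetical: 4 students x 5 questions
    [0, 0, 0, 0, 0],    # consistently scientific
    [1, 1, 1, 1, 1],    # consistently alternative
    [0, 1, 0, 1, 2],    # mixed model use
    [0, 0, 1, 0, 0],
])

n_models = 3
m = responses.shape[1]

def state_vector(row):
    """Student model state vector u with components sqrt(n_k / m),
    where n_k counts the responses coded as model k; |u| = 1."""
    counts = np.bincount(row, minlength=n_models)
    return np.sqrt(counts / m)

# Class model density matrix: D = (1/N) * sum_k u_k u_k^T.
states = np.array([state_vector(r) for r in responses])
D = states.T @ states / len(responses)

# A single dominant eigenvalue indicates consistent use of one model;
# eigenvalues of comparable size indicate mixed, inconsistent model use.
eigvals = np.linalg.eigvalsh(D)[::-1]
print("density matrix:\n", D.round(3))
print("eigenvalues:", eigvals.round(3))
```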
This article investigates the optics misconceptions of 220 year-11 Thai high-school students. The misconceptions became apparent when the students attempted to explain how an object submerged in a water tank is 'seen' by an observer looking into the tank from above and at an angle. The two diagnostic questions used in the study probe the students' ability to use a ray diagram to explain the relationship between object, image, and observer, and then to use the ray diagram to qualitatively determine the position of the image. The study indicates that these high-school students, even after instruction, held significant misconceptions about the direction of propagation of light, how light refracts at an interface, and how to determine the position of an image, and that they used various conceptual models to explain how the object can be 'seen' in this situation. Only 22% of all students had a qualitative understanding of how to use a ray diagram to determine image position, and only 1 of the 220 students identified the correct image position using correct reasoning. Our results indicate that students require very careful instruction if they are to understand how objects are 'seen' and how images are formed when light refracts at a planar surface.
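As background to the physics the diagnostic questions probe, here is a minimal sketch of refraction at a planar water-air interface and the standard paraxial apparent-depth result; the angle and tank depth below are hypothetical values chosen for illustration, not data from the study.

```python
# Snell's law at a water-air interface and paraxial apparent depth.
import numpy as np

n_water, n_air = 1.33, 1.00

def refracted_angle(theta_i_deg, n1, n2):
    """Angle of the refracted ray (degrees) from n1 sin(t1) = n2 sin(t2)."""
    return np.degrees(np.arcsin(n1 * np.sin(np.radians(theta_i_deg)) / n2))

# A ray leaving the water bends away from the normal:
print(refracted_angle(30.0, n_water, n_air))   # ~41.7 degrees

# Paraxial apparent depth of a submerged object viewed from above:
# d_apparent = d_real * (n_air / n_water), so the image appears raised.
d_real = 1.0                     # metres, hypothetical tank depth
print(d_real * n_air / n_water)  # ~0.75 m
```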
As universities attempt to integrate active learning into their lectures, a range of strategies is emerging. Among these strategies are pre-prepared worksheets which students work through, facilitated by the lecturer. Although worksheets have not yet been the subject of much research activity, there are instances of their use. One such instance is by a pair of physics lecturers at Mahidol University, Thailand. The worksheets, called guided worksheets because they provide structure for students to take notes as the content of the lectures progresses, are prepared by the lecturers and have been in use since 2004. Evaluations showed that the guided worksheets met their intent, but there were issues around certain topics which students found challenging. A concerted effort led to the development of research-based specialized guided worksheets for those problematic topics. These specialized guided worksheets, which require substantially more interaction and student problem solving in line with active learning strategies, have been in use since 2012. This paper describes the design of the specialized guided worksheets for the topic of electric field, and their evaluation. Pre- and post-tests were implemented over two years: the first with guided worksheets and 260 students in 2011, and the second with specialized guided worksheets and 163 students in 2012. Gains in student understanding were higher in 2012, and interviewed students indicated that they found the specialized guided worksheets helpful for learning. The results indicate that the specialized guided worksheets made a difference in topics that students find challenging.
This study investigated the multiple-choice Test of Understanding of Vectors (TUV) by applying item response theory (IRT). The difficulty, discrimination, and guessing parameters of the TUV items were fit with the three-parameter logistic model of IRT using the PARSCALE program. The TUV ability is an ability parameter, estimated here assuming unidimensionality and local independence. In addition, all distractors of the TUV were analyzed from item response curves (IRCs), which represent a simplified form of IRT. Data were gathered from 2392 science and engineering freshmen at three universities in Thailand. The results show IRT analysis to be useful in assessing the test, since its item parameters are independent of the ability parameters. The IRT framework reveals item-level information and indicates appropriate ability ranges for the test, while the IRC analysis can be used to assess the effectiveness of the test's distractors. Both approaches reveal test characteristics beyond those revealed by classical test analysis methods, and test developers can apply them to diagnose and evaluate items across the ability levels of test takers.
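As an illustration of the model named above, here is a minimal sketch of the three-parameter logistic (3PL) item response function; the item parameters used are hypothetical examples, not the PARSCALE estimates reported in the study.

```python
# The 3PL item response function used in IRT analyses of tests like the TUV.
import numpy as np

def p_correct(theta, a, b, c):
    """3PL probability of a correct response:
    P(theta) = c + (1 - c) / (1 + exp(-a * (theta - b)))
    a: discrimination, b: difficulty, c: guessing (lower asymptote)."""
    return c + (1.0 - c) / (1.0 + np.exp(-a * (theta - b)))

theta = np.linspace(-3, 3, 7)   # ability scale, in standard-deviation units
print(p_correct(theta, a=1.2, b=0.5, c=0.2).round(3))
# An empirical item response curve compares this model curve with the
# observed fraction of students in each ability bin choosing the correct
# answer (or a given distractor).
```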