Science labs should promote reasoning that resembles the work scientists actually do; however, this is often not the case. We present a lab in which students strive to find out which of two models better describes a physics experiment. The quantification of measurement uncertainties, another topic often neglected in high school curricula, determines the quality of the data and plays a central role in the final decision between these models; it is also a vital skill in the modern world.
Interpreting experimental data in high school experiments can be a difficult task for students, especially when there is large variation in the data. At the same time, calculating the standard deviation poses a challenge for students. In this article, we examine alternative uncertainty measures for describing the variation in data sets, comparing them in terms of mathematical complexity and statistical quality. The mathematical complexity is assessed against different mathematics curricula. The statistical quality is determined using a Monte Carlo simulation in which these uncertainty measures are compared to the standard deviation. Results indicate that an increase in complexity goes hand in hand with an increase in quality. Additionally, we propose a sequence of these uncertainty measures with increasing mathematical complexity and increasing quality. As such, this work provides a theoretical background for implementing uncertainty measures suitable for different educational levels.
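The abstract does not specify which alternative uncertainty measures were compared, so the sketch below uses two plausible candidates of increasing complexity — the half-range and the mean absolute deviation — as illustrative assumptions. It shows how a Monte Carlo simulation of the kind described might work: draw many small data sets from a normal distribution with known spread, apply each measure, and compare the averages to the standard deviation.

```python
import random
import statistics

def half_range(xs):
    # Simplest measure: half the distance between the extremes.
    return (max(xs) - min(xs)) / 2

def mean_abs_dev(xs):
    # Intermediate measure: mean absolute deviation from the mean.
    m = statistics.fmean(xs)
    return statistics.fmean(abs(x - m) for x in xs)

def monte_carlo(measure, n_trials=10_000, n_points=5, sigma=1.0):
    """Average value of `measure` over many simulated data sets
    drawn from a normal distribution with known sigma."""
    rng = random.Random(42)  # fixed seed for reproducibility
    total = 0.0
    for _ in range(n_trials):
        xs = [rng.gauss(0.0, sigma) for _ in range(n_points)]
        total += measure(xs)
    return total / n_trials

for name, m in [("half-range", half_range),
                ("mean abs dev", mean_abs_dev),
                ("std dev", lambda xs: statistics.stdev(xs))]:
    print(f"{name:>12}: {monte_carlo(m):.3f}")
```

The averages differ systematically from the true sigma, which is exactly the kind of bias and spread comparison such a simulation can quantify.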
What are the structural characteristics of written scientific explanations that make them good? This is often difficult to measure. One approach to describing and analyzing structures is to employ network theory. With this research, we aim to describe the elementary structure of written explanations, their qualities, and the differences between those made by experts and students. We do this by converting written explanations into networks called element maps and measuring their characteristics: size, the ratio of diameter to size, and betweenness centrality. Our results indicate that experts give longer explanations with more intertwinement, organized around a few central key elements. Students’ explanations vary widely in size, are less intertwined, and often lack a focus around key elements. We have successfully identified and quantified characteristics that can be a starting point for guiding students towards generating expert-like written explanations.
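The three metrics named above — size, the diameter-to-size ratio, and betweenness centrality — are standard graph quantities and can be computed with the standard library alone. The toy element map below is an invented example, not one from the study; the node labels ("claim", "law", etc.) are purely illustrative. Betweenness is computed with Brandes' algorithm for unweighted graphs; a hub node such as "law" here plays the role of the central key element that expert explanations are organized around.

```python
from collections import deque, defaultdict

def bfs_distances(adj, s):
    # Hop distances from s in an unweighted graph.
    dist = {s: 0}
    q = deque([s])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def diameter(adj):
    # Longest shortest path over all node pairs (connected graph assumed).
    return max(max(bfs_distances(adj, s).values()) for s in adj)

def betweenness(adj):
    # Brandes' algorithm: count shortest paths passing through each node.
    cb = {v: 0.0 for v in adj}
    for s in adj:
        stack, pred = [], {v: [] for v in adj}
        sigma = {v: 0 for v in adj}; sigma[s] = 1
        dist = {v: -1 for v in adj}; dist[s] = 0
        q = deque([s])
        while q:
            u = q.popleft()
            stack.append(u)
            for v in adj[u]:
                if dist[v] < 0:
                    dist[v] = dist[u] + 1
                    q.append(v)
                if dist[v] == dist[u] + 1:
                    sigma[v] += sigma[u]
                    pred[v].append(u)
        delta = {v: 0.0 for v in adj}
        while stack:  # accumulate dependencies in reverse BFS order
            w = stack.pop()
            for u in pred[w]:
                delta[u] += sigma[u] / sigma[w] * (1 + delta[w])
            if w != s:
                cb[w] += delta[w]
    return {v: c / 2 for v, c in cb.items()}  # undirected: halve double count

# Invented toy element map (undirected edges between explanation elements).
edges = [("claim", "law"), ("law", "evidence"),
         ("law", "definition"), ("evidence", "data")]
adj = defaultdict(list)
for u, v in edges:
    adj[u].append(v)
    adj[v].append(u)

size = len(adj)
bc = betweenness(adj)
print("size:", size)
print("diameter/size:", diameter(adj) / size)
print("most central element:", max(bc, key=bc.get))
```

In this toy map the explanation is organized around "law", which sits on the most shortest paths; a student map with the same elements strung in a chain would show a larger diameter-to-size ratio and no dominant central element.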
In this lab activity, carbon copy paper is used to record the horizontal distance a marble flies off a table after rolling down an incline. The minimal scatter of the dots visually shows students the high precision, i.e. the small uncertainty, of the measurements. The theoretical prediction of this distance will be too large if students forget to include rotational energy in the energy balance when they calculate the marble’s speed at the bottom of the incline. This results in a discrepancy between the predicted horizontal distance and the measurement result. The precision of the experiment and the absence of overlap with the theoretical prediction are evidence that the prediction must be wrong. Including rotational energy and accounting for a 10% energy loss due to friction makes the measurement result overlap with the theoretical prediction, bringing the two into agreement. Thus, measurement uncertainties guide the process of comparing the measurement result with the prediction: overlap between the theory-based prediction and the measurement result indicates agreement, whereas no overlap implies discrepancy. The lab activity presented here uses measurement uncertainties in a meaningful, indispensable manner. The experimental result is evidence that forces students to rethink their assumptions, in this case about the conservation of energy. This leads to the revision of their calculation, emphasizing the necessity of including rotational energy and friction; without them, the highly precise measurement result disagrees with the theoretical prediction. A procedure such as this, comparing empirical data with theory, is an authentic and common practice in science and should thus find its way into the physics classroom; but it cannot be done without an analysis of measurement uncertainties.
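The energy balance behind the activity can be made concrete with a short calculation. The release height h and table height H below are illustrative assumptions, not values from the article, and a uniform solid marble is assumed, for which rolling without slipping stores an extra 2/5 of the translational kinetic energy in rotation (moment of inertia I = 2/5 m r²).

```python
import math

g = 9.81   # gravitational acceleration, m/s^2
h = 0.20   # assumed release height on the incline, m
H = 0.80   # assumed table height, m

def launch_speed(h, rotational=True, loss=0.0):
    """Speed at the bottom of the incline from the energy balance
    (1 - loss) * m*g*h = 1/2 * m*v^2 * factor,
    where factor = 1 + 2/5 for a rolling solid sphere and 1 if
    rotational energy is (mistakenly) omitted."""
    factor = 7 / 5 if rotational else 1.0
    return math.sqrt(2 * g * h * (1 - loss) / factor)

def horizontal_distance(v):
    # Projectile motion off the table edge: fall time from height H,
    # then horizontal distance x = v * t.
    t = math.sqrt(2 * H / g)
    return v * t

for label, v in [
    ("translation only       ", launch_speed(h, rotational=False)),
    ("with rotation          ", launch_speed(h)),
    ("rotation + 10% friction", launch_speed(h, loss=0.10)),
]:
    print(f"{label}: v = {v:.2f} m/s, x = {horizontal_distance(v):.3f} m")
```

The translation-only prediction overestimates the launch speed by a factor of sqrt(7/5) ≈ 1.18, which is why the predicted landing point misses the tight cluster of carbon-paper dots until rotation and friction are included.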