Background: Recent developments in STEM and computer science education put a strong emphasis on twenty-first-century skills, such as solving authentic problems. These skills typically transcend single disciplines. Thus, problem-solving must be seen as a multidisciplinary challenge, and the corresponding practices and processes need to be described using an integrated framework. Purpose: We present a fine-grained, integrated, and interdisciplinary framework of problem-solving for education in STEM and computer science by cumulatively including ways of problem-solving from all of these domains. The framework thus serves as a toolbox with a variety of options, described in terms of steps and processes, from which students can choose. It can be used to develop problem-solving competences. Sources of evidence: The framework was developed on the basis of a literature review. We included all prominent ways of domain-specific problem-solving in STEM and computer science, comprising mainly empirically orientated approaches, such as inquiry in science, and solely theory-orientated approaches, such as proofs in mathematics. Main argument: Since there is an increasing demand for integrated STEM and computer science education when working on natural phenomena and authentic problems, a problem-solving framework that exclusively covers the natural sciences or any other single domain falls short. Conclusions: Our framework can support both practice and research by providing a common background that relates the ways, steps, processes, and activities of problem-solving in the different domains to a single common reference. In doing so, it can support teachers in explaining the multiple ways in which science problems can be solved and in constructing problems that reflect these numerous ways. STEM and computer science educational research can use the framework to develop problem-solving competences at a fine-grained level, to construct corresponding assessment tools, and to investigate under what conditions learning progressions can be achieved.
Justifications play a central role in argumentation, which is a core topic in school science education. This paper contributes to this field of research by presenting two studies in which we assess students' justifications for supporting or rejecting hypotheses in the physics lab based on self-collected, anomalous experimental data, which are defined as data that contradict a prior belief, hypothesis, or concept. Study 1 analyzes the spectrum of possible justifications students give in semi-structured interviews and categorizes these into ten types: appeal to an authority, data as evidence, experimental competence (technical/skills), experimental competence (self-concept), ignorance, intuition, measurement uncertainties (explicit), measurement uncertainties (implicit), suitability of the experimental setup, and use of theoretical concepts. Study 2 presents a questionnaire suitable for medium- and large-scale assessments that probes students' use of four of these types of justifications: appeal to an authority, data as evidence, intuition, and measurement uncertainties (explicit). The questionnaire can be administered in 5-10 minutes and is designed for students in the eighth and ninth grades. We outline the development and quality of the assessment tools of both studies, reporting on the content validity, factorial validity, discriminant validity, convergent validity, and reliability of the questionnaire. The two studies shed light, at a fine-grained level, on the various justifications students use when evaluating anomalous data.
We report the findings of an empirical study that investigated whether the source of data (firsthand or secondhand data gained from lab work experiments) has an influence on students' learning outcomes. Results indicate that students' choice of a correct or incorrect hypothesis for a pendulum lab experiment on the influence of the mass of the bob on the time of oscillation does not depend on who the author of the data at hand is: the student themself, a peer, or a teacher. Further, students rate the data's author as relatively unimportant regardless of which data source they have at hand. Thus, it seems fairly unimportant whether students use firsthand or secondhand data when the teaching focus is on choosing a correct hypothesis in light of empirical data, as long as students receive enough information on how the data were generated, analyzed, and interpreted. This result is especially relevant for practitioners, as it shows that secondhand data can be used for the purpose of evaluation and interpretation without significant distortions of epistemic learning processes.