2021
DOI: 10.1080/10627197.2021.1971966
A Methodology for Determining and Validating Latent Factor Dimensionality of Complex Multi-Factor Science Constructs Measuring Knowledge-In-Use

Cited by 12 publications (5 citation statements) · References 18 publications
“…For our purpose, we recast response process validity as the extent to which the score levels assigned to children on the COR-Adv1.5 items reflect their level of proficiency on the construct being measured. Following Kaldaras et al (2021), we assessed this feature by examining the average factor scores against the observed (raw) score levels for each item. If the item score levels are well delineated and the scoring rubric is both properly defined in the test instrument and properly interpreted by the observers, the corresponding average factor score should be consistently higher as one progresses along the score levels for each item, and the separation between each raw score should be evident.…”
Section: Evidence Based On Response Process
confidence: 99%
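The check described in the statement above — that average factor scores should rise monotonically across the raw score levels of each item, with clear separation between levels — can be sketched in a few lines. This is a minimal illustration, not the cited study's actual analysis; the column names, item, and toy data are hypothetical.

```python
import pandas as pd

def factor_score_separation(df: pd.DataFrame, item: str,
                            factor: str = "factor_score"):
    """Mean latent factor score at each raw score level of one item.

    Returns the per-level means and a flag indicating whether they
    increase monotonically with the raw score level (the pattern
    expected when the scoring rubric is well delineated).
    """
    means = df.groupby(item)[factor].mean().sort_index()
    return means, means.is_monotonic_increasing

# Toy data: raw item scores 0-2, factor scores rising with level.
data = pd.DataFrame({
    "item1":        [0,    0,    1,   1,   2,   2],
    "factor_score": [-1.2, -0.8, 0.1, 0.3, 0.9, 1.4],
})
means, ok = factor_score_separation(data, "item1")
```

Here `means` holds one average factor score per raw score level, so plotting it (or inspecting `ok`) directly mirrors the described check: well-separated, consistently increasing averages support response-process validity for that item.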
“…In terms of the type of teaching methods, hands-on projects are more meaningful and interesting for female students (Mitchell, 1993; Halpern, 2004; Geist and King, 2008). Project-based learning (Blumenfeld et al, 2000) has proven to increase students' engagement as well as a deeper understanding of scientific problems (Kaldaras et al, 2021). Instruction based on memorizing without understanding has become obsolete.…”
Section: Motivation: Curriculum Perception Perspective
confidence: 99%
“…The student responses to the items were scored by a trained group of coders using holistic rubrics. The scores were then used to conduct measurement invariance (Kaldaras et al, 2021b) and item response theory analysis as part of the validation study which provided strong evidence for the validity of both LPs (Kaldaras, 2020;Kaldaras et al, 2021a).…”
Section: Study Setting
confidence: 99%