2019
DOI: 10.1007/s10857-019-09445-0

Classroom observation and mathematics education research

Cited by 42 publications (34 citation statements) | References 61 publications
“…There remains no clear consensus on a single set of best practices for argument‐based validation methods, while well‐supported approaches and frameworks for validation abound and remain underused in practice. In addition to the lack of validity evidence in peer‐reviewed articles in mathematics education (Bostic, Krupa, et al., 2019; Bostic, Lesseig, et al., 2019; Hill & Shih, 2009), a review of 121 instruments used in projects funded by the National Science Foundation's DRK‐12 program (Minner, Erickson, Wu, & Martinez, 2012; Minner, Martinez, & Freeman, 2012) found that only 67 of those instruments (55%) had any validity evidence available for them. Perhaps the appearance of disagreement in the literature has led to confusion over how to investigate the validity of instruments for their intended purposes, which in turn has left the validity of nearly half of the instruments used in STEM education research unexamined.…”
Section: Discussion
confidence: 99%
“…In another review of mathematics education research, Hill and Shih (2009) found that only 17% of the 47 studies they reviewed provided any validity information, and that most of the information provided was psychometric in nature (e.g., factor analyses). In a review of 114 peer-reviewed articles that use classroom observation protocols in mathematics education research, Bostic, Lesseig, Sherman, and Boston (2019) found that only 29% of the manuscripts contained validity evidence that is consistent with the Standards (AERA et al., 2014), which was predominantly evidence related to internal structure in the form of reliability information. Bostic, Lesseig, et al. (2019) conducted a second literature review for validity evidence for any of the 27 classroom observation protocols used in the articles they reviewed, and found that only seven (26%) had validity evidence available for more than one of the five sources described in the Standards (AERA et al., 2014).…”
Section: Argument-based Validation Discussed But Not Applied
confidence: 99%
“…Many researchers depend on self-reported instructional practices (e.g., Desimone et al., 2013; Garet et al., 2011), yet even the most well-intentioned teachers tend to overestimate their teaching effectiveness and their uses of evidence-based instructional practices (National Research Council, 2012). For instance, Desimone et al. (2010) used National Assessment of Educational Progress (NAEP) data and found very low correlations between K-12 student and teacher perceptions of classroom activities. The National Center for Education Statistics (NCES) conducted a careful validity and reliability study comparing data from K-12 observations, self-reported survey data, and self-reported daily teaching logs (US Department of Education, NCES, 1999).…”
Section: Classroom Observations and Student Learning
confidence: 99%