2010
DOI: 10.1080/01619561003685379

Learning to Learn From Data: Benchmarks and Instructional Communities

Cited by 70 publications (98 citation statements); references 17 publications.
“…2 In 1 or 2 days between the 5th and 6th weeks, teachers were expected to analyze data, develop, and implement an instructional response and test for mastery. As the two articles about data use at the school and individual level suggest, there was extensive variation in how these steps were enacted across schools (see Blanc, Christman, Liu, Mitchell, & Travers, 2010/this issue, and Nabors Oláh, Lawrence, & Riggan, 2010/this issue). The Philadelphia Benchmarks were consistent with Perie and colleagues' (2007) definition of an interim assessment, in that they "(1) evaluate students' knowledge and skills relative to a specific set of academic goals, typically within a limited time frame, and (2) are designed to inform decisions at both the classroom and beyond the classroom level" (p. 4).…”
Section: The Philadelphia Elementary and Middle Grades Benchmark Tests
confidence: 95%
“…Such studies focused on how teachers engaged with data, their assessment or data literacy, and their skills in analysing and interpreting data to draw conclusions about student learning and identify next steps for instruction (e.g., Oláh, Lawrence, & Riggan, 2010; Hoover & Abrams, 2013; Blanc et al., 2010; Shepard, Davidson, & Bowman, 2011; Abrams, McMillan, & Wetzel, 2015; Wohlstetter, Datnow, & Park, 2008). Collectively, these studies found that teachers interpret assessment data at a macro, classroom level rather than at a micro level focused on individual students' learning and misconceptions.…”
Section: Theoretical Framework
confidence: 99%
“…To use a student monitoring system (SMS) and translate resulting data into planned and implemented instructional activities is, in many cases, new to school staff. Blanc et al (2010) highlighted DBDM knowledge and skills that are essential for making DBDM work: knowledge and skills for evaluating student progress (e.g., how to use a student monitoring system and interpret the collected data) and for setting SMART (specific, measurable, attainable, realistic, and time-bound) performance goals. The DBDM knowledge and skills already available within school teams at the start of the intervention may therefore influence the success of a DBDM intervention.…”
Section: Knowledge, Skills, and Attitude
confidence: 99%