2020 Physics Education Research Conference Proceedings
DOI: 10.1119/perc.2020.pr.jambuge
Assessment feedback: A tool to promote scientific practices in upper-division

Abstract: Students' scores on assessments play a vital role in course modifications, though their effectiveness relies on the quality of the interpretation of these scores. We adapt the notion of assessments as a change agent so that a well-developed rubric accompanied by intentionally designed instructor feedback can act as a tool to inform course improvement. In conjunction with work developing a standardized upper-division thermal physics assessment, this pilot work articulates a methodology to determine feedback for…

Cited by 2 publications (4 citation statements); citing publications from 2021 and 2023
References 13 publications
“…This work also informs our on-going work of developing a new standardized assessment for upper-division thermal physics - The Thermal and Statistical Physics Assessment (TaSPA). In particular, the work presented in this paper informs the assessment tasks and the associated feedback for instructors in-development based on students' responses to these tasks [23].…”
Section: Discussion
confidence: 99%
“…In this paper, we build on work by Harris et al, and Stephenson et al to design assessment tasks that address scientific practices by leveraging principles of Evidence-Centered Design and to validate them for their potential to elicit expected evidence [13][14][15][16][17][18][19]. The existing work on designing assessment tasks to assess scientific practices using Evidence-Centered Design covers middle school science students [10,11,13,14,20,21], introductory-level chemistry students [16], and upper-division physics students [22,23]. We fill the gap in the literature by introducing a theory-driven methodology adopting Evidence-Centered Design to assess scientific practices in physics paper-based assessments at introductory-level.…”
Section: Introduction
confidence: 99%
“…TaSPA was developed with specific attention to how faculty use assessment results to inform changes to their instruction based on theories of self-regulated learning and evidence-centered design (ECD) [2,[5][6][7]. TaSPA evaluates students with respect to their achievement of a learning goal categorically as: "Met"(M), "Partially Met"(P), or "Not Met"(N); referred to together as MPN categories.…”
Section: Background and Motivation
confidence: 99%
“…II). Ideally, this allows faculty to more clearly interpret their students' performance [2]. This solution, while potentially allowing us to report results of RBAs in a more useful way, has its own unique challenges; when rating individual students into categorical bins (instead of a continuous score category), how do we generate feedback that meaningfully represents the course as a whole based on a distribution of categorical variables (i.e., how many students achieved, partially achieved, or did not achieve the learning goal)?…”
Section: Introduction
confidence: 99%