Performance assessments (PAs) offer a more authentic measure of higher-order skills, making them well suited to competency-based education (CBE), especially for students already in the workplace and striving to advance their careers. The goal of the current study was to examine the validity of undergraduate PA score interpretation in the college of information technology (IT) at an online, CBE higher education institution by evaluating (a) the transparency of the cognitive complexity, or demands, of each task as communicated through the task prompt versus the expected cognitive complexity based on its associated rubric aspect and (b) the impact of cognitive complexity on task difficulty. We found a discrepancy between the communicated and expected cognitive complexity of PA tasks (i.e., prompt vs. rubric): rubric complexity is higher, on average, than task prompt complexity. This discrepancy negatively impacts reliability but does not affect the difficulty of PA tasks. Moreover, the cognitive complexity of both the task prompt and the rubric aspect significantly impacts task difficulty when complexity is coded with Bloom's taxonomy, but not with Webb's Depth of Knowledge (DOK), and this effect is slightly stronger for the rubric aspect than for the task prompt. Discussion centers on how these findings can inform and improve PA task writing and review procedures for assessment developers, and how PA difficulty levels can be tailored to different course levels or individual students to improve learning.
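To make the analysis described above concrete, the following minimal Python sketch compares prompt-level and rubric-level Bloom's codes with a paired nonparametric test and correlates each coding with task difficulty. All values, variable names, and scales below are hypothetical illustrations, not the study's data or models.

```python
# Minimal, illustrative sketch (not the study's code): comparing the cognitive
# complexity communicated by task prompts with that expected by rubric aspects,
# and relating each to task difficulty. Bloom's levels are coded ordinally 1-6;
# all values, names, and scales below are hypothetical.
import numpy as np
from scipy import stats

# Hypothetical Bloom's codes, one pair per PA task
prompt_bloom = np.array([2, 3, 3, 2, 4, 3, 2, 3])  # complexity in the prompt
rubric_bloom = np.array([3, 5, 4, 3, 5, 4, 3, 4])  # complexity in the rubric aspect

# Paired nonparametric test of the prompt-vs.-rubric discrepancy
w, p = stats.wilcoxon(prompt_bloom, rubric_bloom)
print(f"Wilcoxon signed-rank: W={w:.1f}, p={p:.3f}")

# Hypothetical difficulty per task (e.g., proportion of examinees failing)
difficulty = np.array([0.25, 0.40, 0.35, 0.30, 0.55, 0.45, 0.28, 0.42])

# First-pass check of whether each complexity coding predicts difficulty
for label, codes in (("prompt", prompt_bloom), ("rubric", rubric_bloom)):
    rho, p = stats.spearmanr(codes, difficulty)
    print(f"{label} Bloom vs. difficulty: rho={rho:.2f}, p={p:.3f}")
```

An operational analysis would use the full set of coded tasks and appropriate item-level difficulty statistics rather than these toy values.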
Assessments, even summative ones, are part of the learning journey not just for students but for instructors as well. In competency-based education, performance assessments (PAs) are regarded as a highly authentic method of measurement, but their complexity makes them vulnerable to construct-irrelevant variance such as group bias. A differential item functioning (DIF) study was conducted to detect potential bias in a series of information technology PA tasks in which subject matter experts (SMEs) classified task scenarios as neutral or potentially controversial; the latter were hypothesized to be more likely to trigger DIF for certain demographic groups. Given the variety of available DIF methods and their relative strengths and weaknesses, three common statistical methods were used (Mantel-Haenszel (MH), logistic regression (LR), and Lord's chi-square (LC)), followed by a substantive review of flagged items. The hypotheses were largely supported by the analysis and review. Discussion centers on the implications of the findings for assessment strategies in education.
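The MH procedure named above lends itself to a compact illustration. The sketch below computes the classic continuity-corrected MH chi-square for a single dichotomously scored task, stratifying examinees on total score. The responses are simulated and every variable name is hypothetical; this is not the authors' implementation, and the LR and LC analyses are omitted.

```python
# Minimal, illustrative sketch (not the study's code) of a continuity-corrected
# Mantel-Haenszel (MH) DIF test for one dichotomously scored PA task.
# Examinees are stratified on total score; each stratum contributes a 2x2
# table of group (reference/focal) by pass/fail. All data are simulated.
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(0)
n = 400
group = rng.integers(0, 2, n)              # 0 = reference, 1 = focal (hypothetical)
total = rng.integers(0, 21, n)             # matching criterion: total score 0-20
# Simulate pass/fail driven by ability plus a small focal-group penalty (DIF)
p_pass = 1 / (1 + np.exp(-(0.3 * (total - 10) - 0.5 * group)))
passed = rng.random(n) < p_pass

strata = np.digitize(total, bins=[5, 10, 15])   # four score bands
sum_a = sum_e = sum_v = 0.0
for k in np.unique(strata):
    m = strata == k
    a = np.sum(m & (group == 0) & passed)       # reference, passed
    b = np.sum(m & (group == 0) & ~passed)      # reference, failed
    c = np.sum(m & (group == 1) & passed)       # focal, passed
    d = np.sum(m & (group == 1) & ~passed)      # focal, failed
    nk = a + b + c + d
    if nk < 2:
        continue
    sum_a += a
    sum_e += (a + b) * (a + c) / nk             # expected 'a' under no DIF
    sum_v += (a + b) * (c + d) * (a + c) * (b + d) / (nk**2 * (nk - 1))

mh = (abs(sum_a - sum_e) - 0.5) ** 2 / sum_v    # MH chi-square, df = 1
print(f"MH chi-square = {mh:.2f}, p = {chi2.sf(mh, df=1):.4f}")
```

In practice, tasks flagged by any of the three methods would then proceed to substantive review by SMEs, as in the study described above.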
Diversity, equity, and inclusion (DEI) has become a major focus in higher education in recent years. With increasing attention on equity and fairness, institutions of higher learning are implementing approaches to reviewing and revising curricula and assessments to ensure all students are equally prepared to succeed in their programs of study. The purpose of this chapter is to describe a mixed-methods exploratory approach to identifying sources of inequity in a selection of objective and performance assessments, part of a larger project reviewing a selection of courses for issues related to DEI attainment. Procedures and results are presented for assessments in four courses across four disciplines (general education, information technology, business, and health professions), along with recommended solutions to level the playing field for all students. A discussion of limitations and next steps toward equitable education and assessment opportunities closes the chapter.