2023 | DOI: 10.1108/ils-05-2023-0045

Predictive algorithms and racial bias: a qualitative descriptive study on the perceptions of algorithm accuracy in higher education

Stacey Lynn von Winckelmann

Abstract
Purpose: This study aims to explore the perception of algorithm accuracy among data professionals in higher education.
Design/methodology/approach: Social justice theory guided the qualitative descriptive study and emphasized four principles: access, participation, equity and human rights. Data collection included eight online open-ended questionnaires and six semi-structured interviews. Participants included higher education professionals who have worked with predictive algorithm (PA) recommendations programm…

Cited by 2 publications (3 citation statements) | References: 49 publications
Von Winckelmann (2023), “Predictive algorithms and racial bias: a qualitative descriptive study on the perceptions of algorithm accuracy in higher education”, Information and Learning Sciences, doi: 10.1108/ILS-05-2023-0045.
Section: Summaries of Included Articles (mentioning; confidence: 99%)
Noting that predictive algorithms “have become the most common analytic tool used in higher education […] and open a window into the educational lives of students,” Von Winckelmann (2023) catalogs how higher education institutions have used these tools to investigate and improve student success and engagement, support alumni fundraising strategies, identify students likely to default on their student loans and target students whose timeline to graduation is slower than institutionally expected. As with uses of predictive algorithms in other contexts, the author warns that inappropriate use of predictive algorithms “places students in historically underrepresented groups (HUGs) in a precarious position as there are significant risks of racial biases infiltrating the data.” Von Winckelmann (2023) used a questionnaire and interview protocol informed by data justice theory to investigate how higher education data professionals perceive and vet the accuracy of the algorithms their institutions use. The study confirmed that participants were “aware of both systemic and racial bias in their [predictive algorithm] inputs and outputs and acknowledge their responsibility to use [predictive algorithms] recommendations ethically with students in HUGs.” Among other findings, the study's practical implications recommend that higher education data professionals would be well served by social justice professional education related to data practices.
Section: Summaries of Included Articles (mentioning; confidence: 99%)