As of 2020, the Public Employment Service Austria (AMS) makes use of algorithmic profiling of job seekers to increase the efficiency of its counseling process and the effectiveness of active labor market programs. Based on a statistical model of job seekers' prospects on the labor market, the system, which has become known as the AMS algorithm, is designed to classify clients of the AMS into three categories: those with good chances of finding a job within half a year, those with mediocre prospects on the job market, and those with poor employment prospects over the next two years. Depending on the category a particular job seeker is classified under, they will be offered differing support in (re)entering the labor market. Drawing on science and technology studies, critical data studies, and research on fairness, accountability, and transparency of algorithmic systems, this paper examines the inherent politics of the AMS algorithm. An in-depth analysis of relevant technical documentation and policy documents investigates crucial conceptual, technical, and social implications of the system. The analysis shows how the design of the algorithm is shaped not only by technical affordances but also by social values, norms, and goals. A discussion of the tensions, challenges, and possible biases that the system entails calls into question the objectivity and neutrality of data claims and the high hopes pinned on evidence-based decision-making. In this way, the paper sheds light on the coproduction of (semi)automated managerial practices in employment agencies and the framing of unemployment under austerity politics.
Algorithmic transparency presents a significant challenge to system developers and users of algorithmic systems alike. Framing the problem as a 'wicked' one, this study tackles the issue of transparency in the EnerCoach energy accounting tool by presenting a situated ethnography of the algorithmic system and exploring the challenges of model transparency and post-hoc explainability therein. By engaging stakeholders through participatory design methodologies, both a conceptual understanding of the problem and material solutions to it are developed and evaluated. The findings show the promising potential of participatory design methodologies to elevate users to a 'critical audience', as well as the solutions co-created by the study participants for the challenge of algorithmic transparency. The results also highlight the complexity of the problem: transparency of algorithmic systems must be understood as a multi-faceted and highly contextual 'wicked' problem that requires diverse methodological interventions to reach 'satisficing' solutions.
CCS CONCEPTS: • Human-centered computing → Participatory design; Empirical studies in interaction design; Ethnographic studies; • Social and professional topics → Sustainability.