The proliferation of biometric systems in our societies is shaping public debates around their political, social and ethical implications. Yet, whilst concerns about the racialised use of this technology have been on the rise, the field of biometrics remains largely unperturbed by these debates. Despite this lack of critical engagement, algorithmic fairness has recently been adopted by the biometrics community: various studies have been published that aim to understand and mitigate demographic bias in biometric systems, without analysing the political consequences of doing so. In this paper, we offer a critical reading of recent debates about biometric fairness and show their detachment from these political debates. Building on previous impossibility results in algorithmic fairness, we argue that biometric systems will always exhibit some form of bias. Moreover, we claim that algorithmic fairness cannot distribute justice in scenarios that are structurally broken, or whose intended purpose is to discriminate. By focusing on demographic biases rather than examining how these systems reproduce historical and political injustices, fairness research has overshadowed the elephant in the room of biometrics.