Over the last decade, artificial intelligence (AI) has become ubiquitous; it is in our smartphones and homes (Siri and Alexa), our emails (spam filters), Netflix recommendations, and ride-share apps. On a societal level, it is used for predictive policing and credit lending; in medicine, it promises to transform the shape and scope of health care.

But for all its seemingly endless benefits, AI comes with ethical challenges and potential harm.[1] Most AI algorithms are supervised, meaning that training inputs and outputs are defined by humans. This makes them susceptible to the same socioeconomic, racial, and gender biases that shape our world. Unintended bias can occur at any level: training, algorithm design, and implementation. There are many examples of AI software, from facial recognition to natural language processing, displaying encoded racism that was noticed only after commercial deployment.[2]

The explosion in health care AI research, specifically machine learning (ML) and deep learning (DL), has definite clinical relevance in ophthalmology.[3,4] In 2018, IDx-DR received approval from the US Food and Drug Administration (FDA) for its autonomous AI for diabetic retinopathy (DR) screening; EyeArt's approval followed shortly thereafter. Both set impressively high standards for the validation of AI clinical decision support tools. However, as AI in ophthalmology becomes increasingly commercially available, we will have to contend with the same issues of unintended bias as other industries. Identifying and guarding against these consequences begins with adequate transparency.

To highlight theoretical concepts related to bias in ophthalmologic AI, we examine 2 use cases: DR screening using fundus photos and AI utilizing optical coherence tomography (OCT), a burgeoning area of research. In doing so, we raise questions regarding bias and propose the adoption of a reporting tool, model cards, to promote standardization and transparency.
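For readers unfamiliar with the model card format, the sketch below illustrates the kind of information such a report might capture for a hypothetical DR-screening model. The field names, model name, and numbers are illustrative assumptions chosen for this example, not a prescribed schema or the performance of any actual product.

```python
# Illustrative sketch only: a hypothetical model card for a DR-screening
# model, expressed as a plain Python dict. All names and values below are
# assumptions made for illustration.
model_card = {
    "model_details": {
        "name": "example-dr-screening-cnn",  # hypothetical model name
        "version": "0.1",
        "input": "color fundus photographs",
        "output": "referable DR: yes / no",
    },
    "intended_use": "Screening support in primary care; not a replacement "
                    "for ophthalmic examination.",
    "training_data": {
        "source": "de-identified fundus photos (hypothetical dataset)",
        "demographics_reported": ["age", "sex", "race/ethnicity"],
    },
    "evaluation_data": "held-out images from sites not used in training",
    # Subgroup-level reporting is the part of a model card most directly
    # aimed at surfacing unintended bias.
    "quantitative_analyses": {
        "overall": {"sensitivity": 0.90, "specificity": 0.88},  # placeholders
        "by_subgroup": {
            "subgroup_A": {"sensitivity": 0.91, "specificity": 0.89},
            "subgroup_B": {"sensitivity": 0.84, "specificity": 0.87},
        },
    },
    "ethical_considerations": "Possible differential performance across "
                              "fundus pigmentation and camera hardware.",
    "caveats": "Not validated on images with significant media opacity.",
}

if __name__ == "__main__":
    import json
    print(json.dumps(model_card, indent=2))  # render the card for review
```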