In recent years, the ethical impact of AI has been increasingly scrutinized, with public scandals emerging over biased outcomes, lack of transparency, and the misuse of data. This has led to a growing mistrust of AI and increased calls for mandated ethical audits of algorithms. Current proposals for the ethical assessment of algorithms are either too high-level to be put into practice without further guidance, or they focus on very specific and technical notions of fairness or transparency that do not consider multiple stakeholders or the broader social context. In this article, we present an auditing framework to guide the ethical assessment of an algorithm. The audit instrument itself comprises three elements: a list of possible interests of stakeholders affected by the algorithm, an assessment of metrics that describe key ethically salient features of the algorithm, and a relevancy matrix that connects the assessed metrics to stakeholder interests. The proposed audit instrument yields an ethical evaluation of an algorithm that could be used by regulators and others interested in doing due diligence, while paying careful attention to the complex societal context within which the algorithm is deployed.
The “problem of memory” in epistemology concerns whether and how we could have knowledge of, or at least justification for, trusting our apparent memories. I defend an inductive solution – more precisely, an abductive solution – to the problem. A natural worry is that any such solution would be circular, for it would have to depend on memory. I argue that belief in the reliability of memory can be justified from the armchair, without relying on memory. The justification is, roughly, that my having the sort of experience that my apparent memory should lead me to expect is best explained by the hypothesis that my memories are reliable. My solution is inspired by Harrod's (1942) inductive solution. Coburn (1960) argued that Harrod's solution contains a fatal flaw. I show that my solution is not vulnerable to Coburn's objection, and I respond to several other recent and likely objections.