Abstract: Artificial Intelligence (AI) systems increasingly depend on machine learning models that lack interpretability and algorithmic transparency, and hence may not be trusted by their users. The fear of failure in these systems is driving many governments to demand more explanation and accountability. Take, for example, the "Right to Explanation" rule proposed in the European Union in 2019, which gives citizens the right to demand an explanation of AI-based predictions. Explainable Artificial Intelligence (XAI) …