Robotic and artificially intelligent (AI) systems are becoming prevalent in our day-to-day lives. As human interaction is increasingly replaced by human–computer and human–robot interaction (HCI and HRI), we occasionally speak and act as though we are blaming or praising various technological devices. While such responses may arise naturally, they are still unusual. Indeed, for some authors, it is the programmers or users—and not the system itself—that we properly hold responsible in these cases. Furthermore, some argue that since directing blame or praise at technology itself is unfitting, designing systems in ways that encourage such practices can only exacerbate the problem. On the other hand, there may be good moral reasons to continue engaging in our natural practices, even in cases involving AI systems or robots. In particular, daily interactions with technology may stand to impact the development of our moral practices in human-to-human interactions. In this paper, we put forward an empirically grounded argument in favor of some technologies being designed for social responsiveness. Although our usual practices will likely undergo adjustments in response to innovative technologies, some systems which we encounter can be designed to accommodate our natural moral responses. In short, fostering HCI and HRI that sustains and promotes our natural moral practices calls for a co-developmental process with some AI and robotic technologies.
The introduction of Autonomous Military Systems (AMS) onto contemporary battlefields raises the concern that they will open a techno-responsibility gap, leaving uncertainty about how to attribute responsibility in scenarios involving these systems. In this work I approach this problem in the domain of applied ethics with foundational conceptual work on autonomy and responsibility. I argue that concerns over the use of AMS can be assuaged by recognising the richly interrelated context in which these systems will most likely be deployed. This allows us to move beyond the solely individualist understanding of responsibility at work in most treatments of these cases, toward one that includes collective responsibility. On this view, we can attribute collective responsibility to the collectives of which the AMS form a part, and account for the distribution of burdens that follows from this attribution. I argue that this expansion of our responsibility practices will close at least some otherwise intractable techno-responsibility gaps.
Wearables support their users in a variety of contexts. In doing so, they generate and use a large amount of often highly personal (health) data, without users having the knowledge and experience needed to make informed decisions about how these data are used. Current research lacks concepts that prevent unreflective data sharing and support informed decision-making. In this contribution, we discuss societal challenges of digital sovereignty and present possible ways of visualizing personal (health) data and of interacting with a system that provides transparent information about the use of wearable data. We outline options for visualizing legal and data-protection information and discuss our ideas for making data protection tangible through gamification concepts. Providing interactive and visual data spaces can strengthen users' capacity for autonomous self-determination regarding the disclosure of their data.