2016
DOI: 10.1016/j.robot.2016.08.018

Ethics of healthcare robotics: Towards responsible research and innovation

Cited by 180 publications (126 citation statements)
References 31 publications
“…Autonomous systems are characterised as systems capable of making decisions independently of human interference (Brodsky, 2016; Collingwood, 2017), but, unlike mere automation, they can make these decisions while facing uncertainty (Danks & London, 2017). Autonomous systems have been developed in different domains, including warfare, personal care (Arkin, 2013; Pineau, Montemerlo, Pollack, Roy, & Thrun, 2003; Stahl & Coeckelbergh, 2016; Sukman, 2015) and transport. AVs rely on artificial intelligence (AI), sensors and big data to analyse information, adapt to changing circumstances and handle complex situations as a substitute for human judgement, as the latter would no longer be needed for conventional vehicle operations such as lane-changing, parking, collision avoidance and braking (Long, Hanford, Janrathitikarn, Sinsley, & Miller, 2007; West, 2016).…”
Section: Background To AVs
confidence: 99%
“…Technoethical inquiry into social robots encourages thinking about how we can theorise the moral standing of non-humans (Gunkel, 2017), aids the critical integration of affective elements into robots (Stahl et al., 2014), enriched by the feminist-inspired, contextually-oriented ethics of care (Johansson, 2013; Van Wynsberghe, 2016). TR also feeds into responsible research and innovation practices: social robots in caring contexts, like carebots for the elderly, require negotiated ethical deliberation from all stakeholders on their appropriate form, function, role and relationship capabilities if they are to benefit all parties rather than diminish social flourishing (Stahl & Coeckelbergh, 2016; Stahl et al., 2014; Van Wynsberghe, 2016).…”
Section: Background: Technoethics, Robotics, Social Robots And A Potenti…
confidence: 99%
“…Hence, sexbots could be customised to be vulnerable to human mistreatment (Mackenzie, 2014). This paper on sexbots falls within the roboethics strand of technoethics inquiring into artificial moral agents (DeBaets, 2014; Pana, 2012; Sullins, 2009; Wareham, 2013), robotic moral personhood and rights (Allen & Wallach, 2014; Coeckelbergh, 2010; Gerdes, 2015; Yampolskiy, 2012), whether specific ethical theories or critical ethical faculties should be operationalized in robots (Abney, 2014; Bringsjord, 2017; Bringsjord & Taylor, 2014; Hughes, 2014; Majot & Yampolskiy, 2014), ethical design (Stahl et al., 2014; Van Wynsberghe, 2016), and the optimal roles of specific social robots like carebots (Coeckelbergh, 2012; Stahl & Coeckelbergh, 2016; Van Wynsberghe, 2013). Social robots are machines placed in situations requiring ethical decisions from robots, designers and users, raising crucial technoethical issues over how to ensure mutually beneficial AI.…”
Section: Background: Technoethics, Robotics, Social Robots And A Potenti…
confidence: 99%
“…This raises significant fears about human-robot interactions, the potential for reduced quality of care and the alienation of the ageing person to an inhuman periphery.33 Against such concerns, Stahl and Coeckelbergh emphasise a positive role for robot caregivers and suggest that critics ignore novel and positive modes of interaction between robots and ageing persons.34 Robots and other emerging technologies pose significant questions for the meaning in the lives of persons qua ageing persons.…”
Section: Introduction
confidence: 99%