Responsibility for Killer Robots
Main Author:
Format: Electronic Article
Language: English
Check availability: HBZ Gateway
Journals Online & Print:
Interlibrary Loan: Fernleihe für die Fachinformationsdienste
Published: Springer Science + Business Media B.V. [2019]
In: Ethical theory and moral practice
Year: 2019, Volume: 22, Issue: 3, Pages: 731-747
RelBib Classification: NCD Political ethics; NCJ Ethics of science; VA Philosophy
Further subjects: Hierarchical groups; Moral Philosophy; Artificial Intelligence; Causation; Moral Responsibility; Responsibility gap
Online Access: Full text (resolving system)
Summary: Future weapons will make life-or-death decisions without a human in the loop. When such weapons inflict unwarranted harm, no one appears to be responsible. There seems to be a responsibility gap. I first reconstruct the argument for such responsibility gaps to then argue that this argument is not sound. The argument assumes that commanders have no control over whether autonomous weapons inflict harm. I argue against this assumption. Although this investigation concerns a specific case of autonomous weapons systems, I take steps towards vindicating the more general idea that superiors can be morally responsible in virtue of being in command.
ISSN: 1572-8447
Contained in: Ethical theory and moral practice
Persistent identifiers: DOI: 10.1007/s10677-019-10007-9