(Trigger warning for atrocities of war.)
Human soldiers can revolt against their orders, but human soldiers can also decide to commit atrocities beyond their orders. Many of the atrocities of war are specifically human behaviors. A drone may bomb you or shoot you — very effectively — but it is not going to decide to torture you out of boredom, rape you in front of your kids, or cut off your ears for trophies. Some of the worst atrocities of recent wars — Vietnam, Bosnia, Iraq — have been things that a killer robot simply isn’t going to do outside of anthropomorphized science-fantasy fiction.
The orders given to an autonomous drone, and all of the major steps of its decision-making, can be logged and retained indefinitely. Rather than advocating against autonomous drone warfare, it would be better to advocate for accountable drone warfare.
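To make the "logged and retained" part concrete, here is a minimal sketch of what such accountability logging could look like. This is only an illustration in Python, using a hypothetical DecisionLog class of my own invention rather than anyone's actual drone software: each recorded order or decision step is chained to the previous record by a hash, so that tampering with the retained log is detectable after the fact.

```python
import hashlib
import json
import time

class DecisionLog:
    """Append-only log of a drone's orders and decision steps.

    Each entry is chained to its predecessor by a SHA-256 hash,
    so altering any retained record breaks the chain and is detectable.
    """

    def __init__(self):
        self.entries = []
        self._last_hash = "0" * 64  # genesis value for the hash chain

    def record(self, event_type, detail):
        entry = {
            "timestamp": time.time(),
            "type": event_type,   # e.g. "order_received", "target_classified"
            "detail": detail,
            "prev_hash": self._last_hash,
        }
        # Hash the serialized entry so each record commits to the one before it.
        self._last_hash = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()
        ).hexdigest()
        entry["hash"] = self._last_hash
        self.entries.append(entry)
        return entry

    def verify(self):
        """Recompute the chain and confirm no retained entry was altered."""
        prev = "0" * 64
        for e in self.entries:
            body = {k: v for k, v in e.items() if k != "hash"}
            if body["prev_hash"] != prev:
                return False
            recomputed = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()
            ).hexdigest()
            if recomputed != e["hash"]:
                return False
            prev = e["hash"]
        return True
```

Usage would be something like log.record("order_received", {"order_id": 1}) followed later by log.verify() during an audit; the point is just that the accountability the comment asks for is technically cheap to provide.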
That is indeed a fair point, but I think it is less important when talking about a tyrant gaining control of his own country. The soldiers in Iraq, Bosnia, etc. saw the people they tortured (or similar) not as people but as “the Enemy”. That kind of dehumanization is much harder to achieve when they are supposed to be fighting their own countrymen.
I agree that the killer robots on the horizon won’t have a will to commit atrocities (though I’m not sure what an AGI killer robot might do); however, I must note that this is a tangent.
In my statement, the term “atrocity” was meant to indicate things like genocide and oppression. I was basically saying “humans are capable of revolting in the event that a tyrant wants to gain power, whereas robots are not”.
I think I’ll replace the word “atrocities” for clarity.