I was thinking mainly along the lines of using it in regular combat vs. indiscriminately killing protesters. Autonomous weapons should eventually be better than humans at (a) hitting targets, thus reducing combatant casualties on the side that uses them, and (b) differentiating between combatants and non-combatants, thus reducing civilian casualties. This works under the assumption that something like a guard robot would accompany a patrolling squad. A swarm of small drones that sweeps a city to find and subdue all combatants is, of course, a different matter.
The US is already using its drones in Pakistan in a way that violates many provisions of international law, such as firing on people who come to rescue the wounded.
I wasn’t aware of this; do you have a source on that? Regardless, from what I know, the number of civilian casualties from drone strikes is definitely too high.
I was thinking mainly along the lines of using it in regular combat
US drones in Pakistan usually don’t strike in regular combat; they strike houses while people are asleep inside.
indiscriminately killing protesters
If you want to kill protesters, you don’t need drones. You can simply shoot into the crowd. In most cases, however, that doesn’t make sense and isn’t an effective move.
If you want to understand warfare, you have to move past the standard spin.
I wasn’t aware of this; do you have a source on that?
http://www.theguardian.com/commentisfree/2012/aug/20/us-drones-strikes-target-rescuers-pakistan
Regardless, from what I know, the number of civilian casualties from drone strikes is definitely too high.
The fact that civilian casualties exist doesn’t show that a military violates ethical standards. Shooting at people who rescue the wounded, on the other hand, is a violation of ethical standards.
From a military standpoint there’s an advantage to be gained by killing the other side’s doctors; from an ethical perspective it’s bad, and there’s international law against it.
The US tries to maximize military objectives instead of ethical ones.