I watched “Eye in the Sky” this past week, and ended up having a long argument with a friend afterwards.
The story follows the British military tracking most-wanted terrorists in Kenya with a surveillance drone. The terrorists enter a house, where they begin preparing suicide vests. The plan escalates into a remote drone strike, but the pilot keeps delaying because there is a young girl outside the house.
Essentially, the storyline resembles the trolley problem: do you (potentially) save one innocent girl’s life, or potentially watch terrorists attack a crowded place (the film estimates 80 deaths)?
I found it really hard to sympathise with the “wait and save the girl” argument; the moral conflict here is fairly small, and could have been made much harder in the film. My friend disagreed, saying that what they did was wrong. Am I missing something?
“Wrong” means not only different things but different kinds of things to different people.
To a consequentialist it means “has bad consequences” or maybe “is reasonably expected to have bad consequences”.
To a deontologist it means “breaks moral rules”.
Note that those moral rules may be moral rules because a policy of not breaking them has good consequences, and (because most of us are not good at making difficult decisions in real time) it may actually have better consequences than a policy of always trying to work out the consequences case by case.
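That point about rules sometimes outperforming case-by-case reasoning can be made concrete with a toy simulation (all numbers here are made up for illustration): suppose each case has a true net benefit of breaking the rule, usually negative, and a case-by-case reasoner only sees a noisy estimate of it.

```python
import random

random.seed(0)

def simulate(n_cases=100_000, noise=3.0):
    """Compare a strict rule-follower with a noisy case-by-case reasoner.

    Toy model: breaking the rule is usually harmful (true benefit drawn
    from a distribution with negative mean), and the case-by-case reasoner
    decides from a noisy estimate of that benefit.
    """
    rule_total = 0.0          # never breaks the rule, so benefit 0 every time
    case_by_case_total = 0.0
    for _ in range(n_cases):
        true_benefit = random.gauss(-1.0, 1.0)            # usually negative
        estimate = true_benefit + random.gauss(0.0, noise)  # poor real-time judgement
        if estimate > 0:      # looks worth breaking the rule this time
            case_by_case_total += true_benefit
    return rule_total / n_cases, case_by_case_total / n_cases

rule_avg, case_avg = simulate()
print(f"strict rule:  {rule_avg:+.3f} average benefit per case")
print(f"case by case: {case_avg:+.3f} average benefit per case")
```

With enough estimation noise, the case-by-case reasoner mostly breaks the rule exactly when the noise fooled them, and ends up worse off on average than someone who just follows the rule.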
To a virtue ethicist it means “is the kind of thing a good person wouldn’t do”.
Note that a policy of making oneself a good person (perhaps defined in some kinda-consequentialist terms) and then acting in the ways that come naturally may again have better consequences than trying to work things out case by case (same reason as above), and perhaps also better consequences than following any manageable set of rules (because well-trained human judgement may do better at capturing what you care about than any set of rules simple enough to follow).
I am guessing (from what you say, and from the fact that most people here are more or less consequentialists) that you are a consequentialist; from that perspective, indeed, blowing up the building (or whatever exactly the drone would have done) seems like a clear win.
But perhaps your friend is a deontologist: s/he has a rule “you’re not allowed to kill civilians” and wants it followed in all cases. That will give suboptimal results sometimes, and maybe this case is an example. But it may still be a better policy than “think it out from first principles in every case”. For instance, suppose—as seems pretty plausible, though I don’t know—that drone operators quite often face the possibility of collateral damage, and that in most cases they could avoid killing civilians (without much compromise to military objectives) by taking some extra trouble: waiting a bit, observing for longer, etc. Then if “you’re not allowed to kill civilians” they will take that extra trouble, but in the absence of such a clear-cut rule they may be strongly motivated to find excuses for why, in each individual case, it’s better just to go ahead and accept the civilian deaths. (And there’s a feedback loop here; do that often enough and you’re likely to find yourself caring less about civilian deaths, perhaps even finding rationalizations for why they’re a good thing.)
Or perhaps your friend is a virtue ethicist: good people find it really hard to kill innocent bystanders, so a really good person wouldn’t carry out the strike and kill the girl (even if they agreed that in principle it would be for the best; they just psychologically couldn’t do it); therefore a drone operator who just goes ahead and does it is thereby shown not to be a good person, and that’s why they shouldn’t do it. The consequences of being a Good Person in this particular case may be bad—but a world of Good People would probably have a lot fewer situations in which that kind of decision had to be made in the first place.
Me, I’m pretty much a consequentialist, but I’m consequentialist about policies as well as about individual actions, and I’d at least want to consider a fairly strict no-killing-civilians policy of the sort that would forbid this action. (But I think what I would actually prefer is a policy that almost forbids such things and allows exceptions in really clear-cut cases. I haven’t seen “Eye in the Sky” and therefore have no idea whether this was one.)
One other remark: this sort of drama always makes me uncomfortable, because it enables the people making it to manipulate viewers’ moral intuitions. Case 1: they show lots of cases where this kind of dilemma arises, and in every case it becomes clear that the drone operator should have taken the “tough” line and accepted civilian casualties For The Greater Good. Case 2: they show lots of cases where this kind of dilemma arises, and in every case it becomes clear that the drone operator should have taken the “nice” line because they could have accomplished their objectives without killing civilians. -- Politicians are highly susceptible to public opinion. Do we really want the makers of movies and TV dramas determining (indirectly) national policy on this kind of thing?
(I am not suggesting that they should be forbidden to do it, or anything like that. That would probably be much worse. It just makes me uncomfortable that this happens.)
Great response, thanks.
The deontologists are the ones I find hardest to argue against. Morality is a hard thing to pin down and define, but my original thought process still holds up here.
“you’re not allowed to kill civilians”
Unless moral objectives are black and white, we can assign a badness to each outcome. Killing and allowing death are subtly different to most people, but not to the tune of 80 lives. In both cases you will kill civilians, and in that light the problem becomes one of minimisation. I would still say that inaction is less moral than action in the situation above.
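The minimisation framing is just expected-value arithmetic. A toy comparison (the attack probability is an assumption I am inventing; only the 80-deaths figure comes from the film):

```python
# Assumed numbers: treat every expected death as equally bad.
p_attack = 0.8         # invented: chance the attack succeeds if we hold fire
attack_deaths = 80     # the film's own estimate
girl_deaths = 1

expected_if_strike = girl_deaths             # strike now: ~1 expected death
expected_if_wait = p_attack * attack_deaths  # hold fire: ~64 expected deaths

print(f"strike now: {expected_if_strike} expected deaths")
print(f"hold fire:  {expected_if_wait} expected deaths")
```

Even if the attack is far from certain, the 80-to-1 asymmetry means "hold fire" only wins when the attack probability drops very low (below about 1 in 80, with these stakes).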
drone operators quite often face the possibility of collateral damage, and that in most cases they could avoid killing civilians (without much compromise to military objectives) by taking some extra trouble: waiting a bit, observing for longer, etc.
Civilian death is agreed by everyone to be bad and something to minimise: if waiting doesn’t jeopardise the mission, then minimise away. This was a big part of the film, but it reached a point where they could no longer wait. There is a call to be made: will waiting actually get us anywhere, or are we delaying the inevitable at a risk to the mission? (The civilian in the film was a young girl selling bread. She had a load of loaves to sell.)
This opens up a whole other can of worms. Is it worth waiting to minimise civilian deaths at the risk of failing the mission?
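One way to think about that trade-off is a toy waiting model (all probabilities here are invented for illustration): each minute of waiting gives the girl a chance to leave, but also gives the terrorists a chance to slip away and carry out the attack.

```python
# Invented numbers: per minute of waiting, the girl leaves with probability
# p_leave (we then strike with no civilian deaths), but the mission fails
# with probability p_fail (the terrorists slip away and attack).
p_leave, p_fail = 0.10, 0.05
attack_deaths, girl_deaths = 80, 1

def expected_deaths(max_wait):
    """Expected deaths if we wait up to max_wait minutes, then strike anyway."""
    total = 0.0
    p_reach = 1.0  # probability the standoff is still unresolved at this minute
    for _ in range(max_wait):
        total += p_reach * p_fail * attack_deaths   # mission fails this minute
        p_reach *= (1 - p_fail) * (1 - p_leave)     # neither event: keep waiting
    total += p_reach * girl_deaths                  # time is up: strike anyway
    return total

for minutes in (0, 1, 5, 20):
    print(minutes, "min:", round(expected_deaths(minutes), 2), "expected deaths")
```

With an 80-to-1 asymmetry, even a single minute of waiting is worse in expectation unless the per-minute risk of mission failure is tiny, which is roughly my original intuition about the film's dilemma.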
Then if “you’re not allowed to kill civilians” they will take that extra trouble, but in the absence of such a clear-cut rule they may be strongly motivated to find excuses for why, in each individual case, it’s better just to go ahead and accept the civilian deaths.
The danger of thinking in such a clear-cut way (as a person or as an organisation) is ignoring the cases where inaction is worse. Nobody likes to “kill civilians”, and making up a rule that frees you of the responsibility of doing so does not make the situation better. Your rule should not be “never kill civilians” or “kill the target no matter what, ignoring civilian deaths” but “minimise civilian casualties in any possible manner”.
Or perhaps your friend is a virtue ethicist: good people find it really hard to kill innocent bystanders
I think I’d have many arguments (ehrm—discussions) with a friend like that.
From the drone pilot’s perspective: I’m not sure an organisation would hire a virtue-ethicist drone pilot. It somewhat defeats the purpose. “Spying on people is always bad”?
One other remark: this sort of drama always makes me uncomfortable, because it enables the people making it to manipulate viewers’ moral intuitions. Case 1: they show lots of cases where this kind of dilemma arises, and in every case it becomes clear that the drone operator should have taken the “tough” line and accepted civilian casualties For The Greater Good. Case 2: they show lots of cases where this kind of dilemma arises, and in every case it becomes clear that the drone operator should have taken the “nice” line because they could have accomplished their objectives without killing civilians. -- Politicians are highly susceptible to public opinion. Do we really want the makers of movies and TV dramas determining (indirectly) national policy on this kind of thing?
I thought something similar, actually. I think that, overall, films that properly convey the issue at hand are a good thing. The film covered the conflict above, as well as some inter-country disputes (USA vs UK vs Kenya) and media issues (what would the public think).
Sure, this might change the views of many people. But the media is already filled with opinionated content on air strikes and foreign warfare. You’re not going to remove opinion, but perhaps forcing 90 minutes of debate onto someone is the next best thing.
Your rule should not be “never kill civilians” or “kill target no matter what, ignoring civilian deaths” but “minimise civilian casualties in any possible manner”.
Depends on your computing power.
For example, choosing “minimise civilian casualties in any possible manner” may encourage your opponent to take hostages they wouldn’t take if you precommitted to “kill the target no matter what, ignoring civilian deaths”. If taking hostages makes crime relatively safe and profitable, this may encourage more wannabe criminals to take action. Thus, minimising casualties in the short term may increase casualties in the long term.
Also, it matters how legible your actions are to your opponent, and how credible your precommitments are.
For example, if you choose the strategy “kill the target no matter what, ignoring civilian deaths”, but your opponent believes you would follow it with 10 hostages yet would probably change your mind with 10,000 hostages, well, you have just motivated them to take 10,000 hostages.
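The 10-versus-10,000 point can be sketched as a toy optimisation from the opponent’s side (payoffs and costs here are made up): the opponent takes just enough hostages to cross whatever threshold they believe will make you back down.

```python
# Toy model: the opponent picks a hostage count n.  They believe you will
# strike anyway whenever n <= believed_threshold, and back down otherwise.
# Holding hostages has a small per-hostage cost for them.
def opponent_choice(believed_threshold, hostage_cost=0.001, deterrence_value=100):
    """Return the hostage count that maximises the opponent's payoff."""
    best_n, best_payoff = 0, float("-inf")
    for n in range(0, 20_001, 100):
        deterred = n > believed_threshold   # enough hostages to make you blink
        payoff = (deterrence_value if deterred else 0) - hostage_cost * n
        if payoff > best_payoff:
            best_n, best_payoff = n, payoff
    return best_n

print(opponent_choice(believed_threshold=10))            # a few hostages suffice
print(opponent_choice(believed_threshold=10_000))        # they escalate past 10,000
print(opponent_choice(believed_threshold=float("inf")))  # hostages never pay off
```

A fully credible precommitment (the infinite threshold) makes hostage-taking worthless; a partially credible one invites the opponent to escalate right up to wherever they think your resolve breaks.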
(Then there are strategies to ruin your opponent’s precommitment. Essentially, if your opponent precommits to “if X, then I do Y”, your strategy is to do things that are very similar to X, but not completely X. You keep doing this, and while you technically never did X, only “X minus epsilon”, so your opponent was never required to do Y, you psychologically weaken the credibility of their precommitment, because most people find it difficult to believe that “X minus epsilon” doesn’t bring the strong reaction Y while X would.)