I prefer dust specks because I insist on counting people one at a time. I think it’s obvious that any single person presented with the opportunity to save someone else from fifty years of torture by experiencing a dust speck in the eye ought to do so. Any of those 3^^^3 people who would not voluntarily do so, I don’t have enough sympathy for such individuals to step in on their behalf and spare them the dust speck.
I haven’t yet worked out a good way to draw the line in the escalation scenario, since I suspect that “whatever level of discomfort I, personally, wouldn’t voluntarily experience to save some random person from 50 years of torture” is unlikely to be the right answer.
I think it’s obvious that any single person presented with the opportunity to save someone else from fifty years of torture by experiencing a dust speck in the eye ought to do so.
Woah.
How’d they end up responsible for the choice you make? You can’t have it both ways, that’s not how it works.
My (unfinished, don’t ask for too much detail) ethical theory is based on rights, which can be waived by the would-be victim of an act that would otherwise be a rights violation. So in principle, if I could poll the 3^^^3 people, I would expect them to waive the right not to experience the dust specks. They aren’t responsible for what I do, but my expectations of their dispositions about my choice inform that choice.
Then the “real-world analogy” point in the post prompts me to ask a fun question: do you consider yourself entitled to rob everyone else of one penny to save one starving African child? Because if someone refused to pay up, you “don’t have enough sympathy for such individuals” and would take the penny anyway.
Changing the example to one that involves money does wacky things to my intuitions, especially since many people live in situations where a penny is not a trivial amount of money (whereas I take it that a dust speck in the eye is pretty much commensurate for everybody), and since there are probably less expensive ways to save lives (so unlike the purely stipulated tradeoff of the dust speck/torture situation, I do not need a penny from everyone to save the starving child).
Thanks! It seems my question wasn’t very relevant to the original dilemma. I vaguely recall arguing with you about your ethical theory some months ago, so let’s not go there; but when you eventually finish that stuff, please post it here so we can all take a stab.
You are not placing the question in the least convenient possible world.
In the least convenient possible world: I take it that, in this case, that world is one where wealth is distributed equally enough that one penny means the same amount to everybody, and every cheaper opportunity to save a life has already been taken advantage of.
Why would a world that looked like that have a starving African child? If we all have X dollars, so a penny is worth the same to everyone, then doesn’t the starving African child also have X dollars? If he does, and X dollars won’t buy him dinner, then there just must not be any food in his region (because it doesn’t make any sense for people to sell food at a price that literally no one can afford, and everybody only has X dollars), so X dollars plus (population × 1¢) probably wouldn’t help him either.
Perhaps you had a different inconvenient possible world in mind; can you describe it for me?
One where the African child really does need that cent.
I’m afraid that isn’t enough detail for me to understand the question you’d like me to answer.
How’s that possible? The question is this: there are, say, a trillion people, each of whom has exactly one cent to give away. If almost every one of them parts with their cent, one life gets saved; otherwise, one life is lost. Each of these people can either give up their cent voluntarily, or you, personally, can rob them of that cent (say, you can implement some worldwide policy to do that in bulk). Do you consider it the right choice to rob every one of these people who refuses to pay up?
It sounds like in this possible world, I am a tax collector.
I think it is a suitable use of taxes to save starving people.
So your choice of dust specks is enabled by your prediction that the 3^^^3 people will waive their rights. However, you “don’t have sympathy” for anyone who actually doesn’t. Therefore, you are willing to violate the rights of anyone who does not comply with your predicted ethical conclusion. What, then, if all 3^^^3 people refuse to waive their rights? Then you aren’t just putting a dust speck into the eyes of 3^^^3 people; you’re also, by your own admission, violating their rights. Doesn’t that imply a further compounding of disutility?
I don’t see how your ethical theory can possibly function if those who refuse to waive their rights have them stripped away as a consequence.
By the same argument (i.e. refusing to multiply), wouldn’t it also be better to torture 100 people for 49 years than to torture one person for 50 years?
Not if each of them considers it a wrong choice. Refusing to multiply goes both ways, and no math can debate this choice: whatever thought experiment you present, an intuitive response would be stamped on top and given as a reply.
I did say:
I haven’t yet worked out a good way to draw the line in the escalation scenario, since I suspect that “whatever level of discomfort I, personally, wouldn’t voluntarily experience to save some random person from 50 years of torture” is unlikely to be the right answer.
The scenario you present is among those I have no suitable answer to, for this reason. However, I lean towards preferring the 50 years of torture for one person over 49 years for 100.
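For concreteness, the multiplication being refused here, under the simple assumption that torture-years just add up: 100 people × 49 years = 4,900 person-years of torture, against 1 person × 50 years = 50 person-years, so anyone willing to multiply prefers torturing the single person by a factor of 98 in person-years.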
I prefer dust specks because I insist on counting people one at a time. I think it’s obvious that any single person presented with the opportunity to save someone else from fifty years of torture by experiencing a dust speck in the eye ought to do so.
This is defection, a suboptimal strategy. Each person in isolation prefers to defect in the Prisoner’s Dilemma.
Any of those 3^^^3 people who would not voluntarily do so, I don’t have enough sympathy for such individuals to step in on their behalf and spare them the dust speck.
And this is a preference for fuzzies over utility, an inability to shut up and multiply.
If this is true, then by reductio, the preference for utility is incorrect.
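A minimal sketch of the Prisoner’s Dilemma structure being invoked above, with illustrative payoff numbers that are assumptions rather than anything from the discussion: defecting is each player’s dominant move, yet mutual cooperation leaves both players better off than mutual defection, which is the sense in which each person refusing the dust speck can be called “defection”.

```python
# Illustrative Prisoner's Dilemma payoffs (assumed numbers; higher is better).
# Keys are (my_move, their_move); values are (my_payoff, their_payoff).
PAYOFFS = {
    ("C", "C"): (3, 3),  # both cooperate
    ("C", "D"): (0, 5),  # I cooperate, they defect
    ("D", "C"): (5, 0),  # I defect, they cooperate
    ("D", "D"): (1, 1),  # both defect
}

def best_reply(their_move: str) -> str:
    """The move that maximizes my payoff, holding the other player's move fixed."""
    return max(("C", "D"), key=lambda my_move: PAYOFFS[(my_move, their_move)][0])

# Defection dominates: it is the best reply whatever the other player does...
assert best_reply("C") == "D" and best_reply("D") == "D"
# ...yet both players fare better under mutual cooperation than mutual defection.
assert PAYOFFS[("C", "C")] > PAYOFFS[("D", "D")]
```

On these assumed numbers, each person’s individually preferred move maps to refusing the speck, which is the analogy the comment above is drawing.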