Construct a thought experiment in which every single one of those 3^^^3 is asked whether he would accept a dust speck in the eye to save someone from being tortured, take the answers as a vote. If the majority would deem it personally acceptable, then acceptable it is.
This doesn’t work at all. If you ask each of them to make that decision, you are asking them to compare their one dust speck with somebody else’s one instance of torture. Comparing 1 dust speck with torture, 3^^^3 times over, is not even remotely the same as comparing 3^^^3 dust specks with torture.
If you ask me whether 1 is greater than 3, I will say no. If you ask me 5 times, I will say no every time. But if you ask me whether 5 is greater than 3, I will say yes.
The only way to make it fair would be to ask each of them to compare themselves and the other 3^^^3 − 1 people all getting dust specks against the torture, but I don’t see why asking them should get you a better answer than asking anyone else.
Compare two scenarios: in the first, the vote is on whether every one of the 3^^^3 people is dust-specked or not. In the second, only those who vote in favour are dust-specked, and then only if there’s a majority. But these are kind of the same scenario: what’s at stake in the second scenario is at least half of 3^^^3 dust-specks, which is about the same as 3^^^3 dust-specks. So the question “would you vote in favour of 3^^^3 people, including yourself, being dust-specked?” is the same as “would you be willing to pay one dust-speck in your eye to save a person from 50 years of torture, conditional on about 3^^^3 other people also being willing?”
Let me try and get this straight: you are presenting me with a number of moral dilemmas and asking me what I would do in them.
1) Me and 3^^^^3 − 1 other people all vote on whether we get dust specks in the eye or some other person gets tortured.
I vote for torture. It is astonishingly unlikely that my vote will decide the outcome, but if it doesn’t, then it doesn’t matter what I vote, so the decision is just the same as if it were all up to me.
2) Me and 3^^^^3 − 1 other people all vote on whether everyone who voted for this option gets a dust speck in the eye or some other person gets tortured.
This is a different dilemma, since I have to weigh up three things instead of two: the chance that my vote will save about 3^^^^3 people from being dust-specked if I vote for torture; the chance that my vote will save one person from being tortured if I vote for dust specks; and the (much higher) chance that my vote will save me, and only me, from being dust-specked if I vote for torture.
I remember reading somewhere that the chance of my vote being decisive in such a situation is roughly inversely proportional to the square root of the number of people (please correct me if this is wrong). Assuming this is the case, I still vote for torture, since the term for saving everyone else from dust specks still dwarfs the other two.
3) I have to choose whether I will receive a dust speck or whether someone else will be tortured, but my decision doesn’t matter unless at least half of 3^^^^3 − 1 other people would be willing to choose the dust speck.
Once again the dilemma has changed: this time I have lost my ability to save other people from dust specks, and the probability of my successfully saving someone from torture has massively increased. I can safely ignore the case where the majority of others choose torture, since my decision doesn’t matter then. Given that the others choose dust specks, I am not so selfish as to save myself from a dust speck rather than save someone else from torture.
You try to make it look like scenarios 2 and 3 are the same, but they are actually very, very different.
The bottom line is that no amount of clever wrangling you do with votes or conditionals can turn 3^^^^3 people into one person. If it could, I would be very worried, since it would imply that the number of people you harm doesn’t matter, only the amount of harm you do. In other words, if I’m offered the choice between one person dying and ten people dying, then it doesn’t matter which I pick.
Assuming a roughly 50-50 split, the inverse square-root rule is right. Now my issue is why you incorporate that factor in scenario 2 but not scenario 3. I honestly thought I was just rephrasing the problem, but you seem to see it differently? I should clarify that in scenario 3 you don’t unconditionally receive a speck just because you’re willing to; you receive one only if at least half of the remainder are also willing.
The point of voting, for me, is not to induce scope insensitivity by personalizing the decision, but to incorporate, into your calculation of what to do, the preferences of the vast majority (3^^^^3 out of 3^^^^3 + 1) of participants about the situation they find themselves in. The Torture vs. Specks problem in its standard form asks you to decide on behalf of 3^^^^3 people what should happen to them; voting is a procedure by which they can decide for themselves.
[Edit: On second thought, I retract my assertion that scenarios 1) and 2) have roughly the same stakes. That in scenario 1) huge numbers of people who prefer not to be dust-specked can get dust-specked, while in scenario 2) no one who prefers not to be dust-specked is dust-specked, makes much more of a difference than a simple factor-of-two difference in the number of specks.]
By the way, the problem as stated involves 3^^^3, not 3^^^^3, people, but this can’t possibly matter, so never mind.
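To put a number on the inverse-square-root rule invoked above, here is a minimal sketch (an illustration under the 50-50 assumption, with modest voter counts standing in for 3^^^^3): treating the other n voters as independent fair coin flips, the chance of an exact tie, i.e. of one extra vote being pivotal, is C(n, n/2)/2^n, roughly sqrt(2/(pi*n)), so the expected number of people a torture vote spares from specks is about n times that, roughly sqrt(2n/pi), which grows with n rather than vanishing.

```python
import math

def p_pivotal(n: int) -> float:
    """Chance that one extra vote breaks an exact tie among n independent
    50-50 voters (n even): C(n, n/2) / 2^n, computed in log space so it
    works for large n without huge integers."""
    log_p = math.lgamma(n + 1) - 2 * math.lgamma(n // 2 + 1) - n * math.log(2)
    return math.exp(log_p)

for n in (10**2, 10**6, 10**12):
    exact = p_pivotal(n)
    approx = math.sqrt(2 / (math.pi * n))      # Stirling approximation
    expected_specks_prevented = n * exact      # grows like sqrt(2n / pi)
    print(f"n = {n:.0e}:  P(pivotal) = {exact:.3e}  "
          f"(approx {approx:.3e}),  n * P = {expected_specks_prevented:.3e}")
```

For 3^^^^3 voters the same scaling makes the “spare everyone else a speck” term astronomically large, which is what the scenario-2 reasoning above relies on; the sketch checks only the scaling, not the moral weights.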
There are actually two differences between 2 and 3. The first is that in 2 my chance of affecting the torture is negligible, whereas in 3 it is quite high. The second is that in 2 I have the power to save huge numbers of others from dust specks, and it is this difference that matters to me, since when I have that power it dwarfs the other factors so completely as to be the only deciding factor in my decision. In your ‘rephrasing’ of it you conveniently ignore the fact that I can still do this, so I assumed I no longer could, which made the two scenarios very different.
I also think, as a general principle, that any argument of the type you are formulating, which pays no attention to the specific utilities of torture and dust specks and instead just plays around with who makes the decision, can equally be used to justify killing 3^^^^3 people to save one person from being killed in a slightly more painful manner.
How about this: each of those 3^^^3 people is asked whether they would accept a dust speck in the eye to save someone from 1/3^^^3 of 50 years of torture, and everyone’s choice is granted? (I.e. the ones who say they’d accept a dust speck get a dust speck, and the person is tortured for an amount of time proportional to the number of people who refused.)
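Spelled out (with k introduced here merely as a label for the number of refusers), the proposed scheme is

$$T(k) \;=\; \frac{k}{3\uparrow\uparrow\uparrow 3}\times 50\ \text{years}, \qquad k \in \{0, 1, \dots, 3\uparrow\uparrow\uparrow 3\},$$

so each refusal adds exactly 50/3^^^3 years to the torture and each acceptance adds one dust speck; k = 0 recovers the all-specks outcome and k = 3^^^3 the full 50 years of torture.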
I’m not quite sure what I’d expect to have happen in that case. That’s harder than the moral question because we have to imagine a world that actually contains 3^^^3 different (i.e. not perfectly decision-theoretically correlated) people, and any kind of projection about that kind of world would pretty much be making stuff up. But as for the moral question of what a person in this situation should say, I’d say the reasoning is about the same — getting a dust speck in your eye is worse than 50/3^^^3 years of torture, so refuse the speck.
(That’s actually an interesting way of looking at it, because we could also put it in terms of each person choosing whether they get specked or they themselves get tortured for 50/3^^^3 years, in which case the choice is really obvious — but if you’re still working with 3^^^3 people, and they all go with the infinitesimal moment of torture, that still adds up to a total 50 years of torture.)
Edit: Actually, for that last scenario, forget 50/3^^^3 years; that’s way less than a Planck interval. So let’s instead multiply the duration by enough for it to be noticeable to a human mind, and reduce the intensity of the torture by the same factor.
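For what it’s worth, the Planck-interval claim checks out with standard values (50 years ≈ 1.6 × 10^9 s, Planck time t_P ≈ 5.4 × 10^-44 s):

$$\frac{50\ \text{yr}}{3\uparrow\uparrow\uparrow 3} \;\approx\; \frac{1.6\times 10^{9}\ \text{s}}{3\uparrow\uparrow\uparrow 3} \;\ll\; t_{\mathrm{P}} \approx 5.4\times 10^{-44}\ \text{s},$$

since any divisor beyond 1.6 × 10^9 / 5.4 × 10^-44 ≈ 3 × 10^52 already pushes the duration below one Planck time, and 3^^^3, a power tower of 3s about 7.6 × 10^12 levels tall, is incomparably larger than 10^52.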
The point of Torture vs. Dust Specks is that our moral intuition dramatically conflicts with strict utilitarianism.
Your thought experiment helps express your moral intuition, but it doesn’t do anything to resolve the conflict.
Although, come to think of it, I think there’s an argument to be made that the majority would answer no. If we interpret 3^^^3 people to mean qualitatively distinct individuals, there’s not enough room in humanspace for all of those people to be human—the vast majority will be nonhumans. It can be argued, at least, that if you pick a random nonhuman individual, that individual will not be altruistic towards humans.