“Would you rather experience the mild distraction of a dust speck in your eye, or allow someone else to be tortured for fifty years?”
“Would you rather be tortured for fifty years, or have someone else experience the mild discomfort of a dust speck in their eye?”
Asking this question of (let’s say) humans will lead them to believe that only one person is getting the dust speck in the eye. Of course they’re going to come up with the wrong answer when they have incomplete information.
Now you have to make your decision knowing that the entire universe wouldn’t mind taking dust specks in exchange for sparing one other person from suffering. If that doesn’t change your decision … something is wrong.
There are two problems with this. The first is that if you take a number of people as big as 3^^^3 and ask them all this question, an incomprehensibly huge number of them will prefer to torture the other guy. These people may be insane, demented, cruel, dreaming, or whatever, but according to your ethics they must be taken into account. (And according to mine as well, actually.) The number of people choosing to torture the guy will be greater than the number of Planck lengths in the observable universe. That alone is enough disutility to say “Torture away!”
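Just to put the magnitudes side by side, here is a rough back-of-the-envelope sketch; the Planck-length and universe-diameter constants are approximate assumptions of mine, not figures from the post.

```python
# Rough scale comparison (approximate constants assumed, not from the post).
PLANCK_LENGTH_M = 1.616e-35              # metres, approximate
OBSERVABLE_UNIVERSE_DIAMETER_M = 8.8e26  # metres, approximate

# Planck lengths spanning the observable universe: roughly 5e61.
planck_lengths_across = OBSERVABLE_UNIVERSE_DIAMETER_M / PLANCK_LENGTH_M
print(f"Planck lengths across the observable universe: ~{planck_lengths_across:.1e}")

# 3^^3 (Knuth up-arrow notation) = 3^(3^3) = 3^27.
tower_height = 3 ** (3 ** 3)
print(f"3^^3 = {tower_height}")  # 7,625,597,484,987

# 3^^^3 = 3^^(3^^3): a power tower of 3s that is 7,625,597,484,987 levels
# tall. It cannot be computed or written out in full; even a vanishingly
# small fraction of 3^^^3 respondents dwarfs the ~5e61 figure above.
```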
The other problem is your assertion that “something is wrong” because my decision remains the same after a wrong question, asked with incomplete information, is repeated 3^^^3 times. What, exactly, is wrong? I can tell you that your intuition that “something must be wrong” is simply incorrect. Nothing is wrong with the decision. (And this paragraph is for the LCPW where everyone answered the question selflessly, which is of course not even remotely plausible.)