I just realized it’s possible to explain people picking the dust specks in the torture vs. dust specks question using only scope insensitivity and no other mistakes. I’m sure that’s not original, but I bet this is what’s going on in the head of a normal person when they pick the specks.
The dust speck “dilemma”, like a lot of the other exercises that get the mathematically wrong answer from most people, triggers a very valuable heuristic: the “you are trying to con me into doing evil, so fuck off” heuristic.
Consider the problem as you would if it were a problem presented to you in real life.
The negative utility of the “Torture” choice is nigh-100% certain. It is in your physical presence, you can verify it, and “one person gets tortured” is the kind of event that happens in real life with depressing frequency. The “Billions of people get exposed to very minor annoyance” choice? How is that causal chain supposed to work, anyway? So that choice gets assigned a very high probability of being a lie.
And it is the kind of lie people encounter very frequently. False hypotheticals in which large numbers of people suffer if you do not take a certain action are a common lever for cons. From a certain perspective, this is what religion is: an attempt to hack people’s utility functions by inserting numbers so absurdly large into the equations that, if you assign any probability at all to them being true, they become dominant.
So claims that look like this class of attack routinely get assigned a probability of zero unless they have very strong evidence backing them up, because that is the only way to defend against this kind of mental malware.
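A minimal sketch of the failure mode being described, assuming a naive expected-utility calculation with invented, purely illustrative numbers: once a claimed stake is large enough, any nonzero credence at all makes it dominate the decision, which is exactly why the heuristic rounds such claims down to zero.

```python
# Naive expected-utility comparison with made-up illustrative numbers.
# The point: an absurdly large claimed stake swamps the calculation
# as soon as it is given any nonzero probability at all.

def expected_disutility(probability, disutility):
    return probability * disutility

# A mundane, verifiable harm: one person tortured (stand-in disutility).
torture = expected_disutility(probability=1.0, disutility=1_000_000)

# The con artist's claim: astronomically many people harmed unless you comply.
# Even at one-in-a-billion credence, it dominates.
claimed_catastrophe = expected_disutility(probability=1e-9, disutility=10**30)

print(torture)              # 1000000.0
print(claimed_catastrophe)  # 1e+21 -> dominates the decision
```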
This is essentially an argument from outside the argument. If we really had a choice between 50 years of torture and 3^^^3 dust specks, the rational choice would be the 50 years of torture. But the probability of this description of the situation being true is extremely low.
If you, as a human in a real-life situation, believe that you are choosing between 50 years of torture and 3^^^3 dust specks, you are almost certainly confused or insane. There will not be 3^^^3 dust specks, regardless of whatever clever argument has convinced you otherwise; you are choosing between an imaginary number of dust specks and a probably real torture, in which case you should be against the torture.
The only situations where you can find this dilemma in real life are “Pascal’s mugging” scenarios. Imagine that you want to wear glasses to protect your eyes, and your crazy neighbor tells you: “I read in my horoscope today that if you wear those glasses, a devil will torture you for 50 years.” You estimate the probability of this to be very low, so you wear the glasses despite the warning. But as we know, the probability is never literally zero; you chose to avoid some dust specks in exchange for maybe a 1/3^^^3 chance of being tortured for 50 years. And this is a choice reasonable people make all the time.
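A rough version of that trade-off in numbers, a sketch only: the disutilities are invented for illustration, and the 1/3^^^3 probability is replaced by a 1/10^100 stand-in (the real denominator is incomprehensibly larger, which only strengthens the conclusion). Exact rational arithmetic is used because quantities at this scale underflow floats.

```python
from fractions import Fraction

# Invented, illustrative disutilities (arbitrary units).
dust_speck_disutility = Fraction(1)
torture_50y_disutility = Fraction(10**12)  # however bad you think it is

# Stand-in for 1/3^^^3: the real denominator is vastly larger,
# so the expected torture term would be even smaller.
tiny_probability = Fraction(1, 10**100)

expected_torture = tiny_probability * torture_50y_disutility

# Wearing the glasses: avoid the speck, accept the tiny torture risk.
print(expected_torture < dust_speck_disutility)  # True: a reasonable trade
```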
Summary: In real life we are unlikely to encounter extremely large numbers, so we should be suspicious of them. But we are not unlikely to encounter extremely small probabilities. Mathematically, the two are equivalent, but our intuitions say otherwise.
If you were presented with it in real life, there would be far fewer than 3^^^3 people at stake. You don’t even need to bring certainty into it.
50 years of torture is also something you won’t encounter in real life, but you can get a lot closer.
It probably goes like this: “Well, 3^^^3 is a big number; something like 100. Would I torture a person to prevent 100 people from having a dust speck in their eyes? How about 200, or 1000? No, this is obviously madness.”
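For scale: in Knuth’s up-arrow notation, 3^^3 is already a 13-digit number, and 3^^^3 is a power tower of threes of that height. The sketch below is just a straightforward recursive definition of the notation (not anything from the thread); it shows how fast the hierarchy grows and why substituting “something like 100” throws the arithmetic away entirely.

```python
def up_arrow(a, n, b):
    """Knuth's up-arrow: a with n arrows applied to b (naive recursion)."""
    if n == 1:
        return a ** b
    result = 1
    for _ in range(b):
        result = up_arrow(a, n - 1, result)
    return result

print(up_arrow(3, 1, 3))  # 3^3  = 27
print(up_arrow(3, 2, 3))  # 3^^3 = 3^(3^27... no, 3^(3^3)) = 7625597484987
# 3^^^3 = 3^^(3^^3): a tower of 7,625,597,484,987 threes.
# No direct computation (or intuition calibrated to "about 100") comes close.
```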
Yes. This. Whenever I talk with anyone about the Torture vs. Dust Specks problem, I constantly see them falling into this trap. See, for instance, this discussion post from a few months back, and my reply to it.
This happens again and again, and by this point I am pretty sure that the whole problem boils down to just this.