Pascal’s Mugging and the Order of Quantification
One of the fun things to do when learning first-order logic is to consider how the meaning of a proposition changes dramatically with small changes in syntax. This is in contrast to natural language, where the meaning of a phrase can be ambiguous and we naturally use context clues to determine the correct interpretation.
An example of this is switching the order of quantifiers. Consider the following four propositions:[1]

1. ∀x ∃y Likes(x, y)
2. ∀x ∃y Likes(y, x)
3. ∃y ∀x Likes(x, y)
4. ∃y ∀x Likes(y, x)
These mean, respectively,
1. Everybody likes somebody
2. Everybody is liked by somebody
3. There is a very popular person whom everybody likes
4. There is a very indiscriminate person who likes everyone
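To see concretely that these come apart, here is a small Python sketch that checks all four forms against a toy likes relation; the particular relation is made up for illustration.

```python
# A toy "likes" relation on three people (made up for illustration).
people = {"alice", "bob", "carol"}
likes = {("alice", "bob"), ("bob", "carol"), ("carol", "alice")}

def L(x, y):
    return (x, y) in likes

p1 = all(any(L(x, y) for y in people) for x in people)  # ∀x ∃y Likes(x, y)
p2 = all(any(L(y, x) for y in people) for x in people)  # ∀x ∃y Likes(y, x)
p3 = any(all(L(x, y) for x in people) for y in people)  # ∃y ∀x Likes(x, y)
p4 = any(all(L(y, x) for x in people) for y in people)  # ∃y ∀x Likes(y, x)

print(p1, p2, p3, p4)  # True True False False: the ∀∃ forms hold, the ∃∀ forms fail
```

Other relations can also separate the first pair from each other, but even this tiny example shows that swapping ∀ and ∃ changes the truth value.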
These all have quite different meanings! Now consider an exchange between Pascal and a mugger:
Mugger: I am in control of this simulation and am using an avatar right now. Give me $5 or I will go outside of this simulation and cause you to lose $10.
Pascal: That does not seem like a wise choice for me. Let’s say the probability of you being in control of this simulation is 1⁄10. Then the expected value of refusing is (1⁄10)⋅(−$10) = −$1, which is better than the certain −$5 I would lose by complying, so I will not hand it over.
Mugger: Okay then, forget about the money, let’s just focus on units of utility and call these utils. I am going to demand that you give me 5 utils. Because you think that the probability of me being in control of this simulation is 1⁄10, I will go outside of the simulation and simulate 20 conscious agents losing 5 utils each. Now the expected value of you not complying is (1⁄10)⋅(−20⋅5) = −10. And −10 is greater (more negative) than the −5 you would lose. So, hand me over the utils!
Pascal: Well, I just made up 1⁄10 on the spot. In reality, the probability that you are in control of this simulation is much, much lower.
Mugger: Ah, but it does not really matter what probability you assign! Given any probability P you have that I am in control of this simulation, I will go outside of this simulation and simulate X lost utils so that −P⋅X is greater (more negative) than the −5 you would lose. So, hand me over the utils! (X here is the total amount of lost utility the mugger will simulate.)
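Before formalizing, here is the mugger’s recipe as a quick sketch. The rule X = 10/P below is just one hypothetical way to cash out his claim; at P = 1⁄10 it reproduces the 20 agents losing 5 utils each from the exchange above.

```python
# Whatever probability P Pascal assigns, the mugger names a simulated
# loss X large enough that the expected loss P*X beats the 5 utils
# demanded. X = 10/P is one such choice (expected loss is always 10).
def mugger_choice(P: float) -> float:
    return 10 / P  # total utils the mugger threatens to simulate

for P in [1/10, 1/1000, 1/10**9]:
    X = mugger_choice(P)
    print(f"P = {P:g}: threaten X = {X:g} utils, expected value of refusing = {-P * X:g}")
```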
Now, let’s formalize the mugger’s argument and see if we can learn something. The mugger is claiming:
∀P ∃X (−P⋅X < −5).
I propose that there is a simple counterargument to make here: P cannot be quantified before X, and in fact P should be a function of X. No matter what X is, we assert that |−P(X)⋅X| < ϵ, where ϵ is such a small amount of util that it cannot even be meaningfully handed over by Pascal. That is,

∀X (|−P(X)⋅X| < ϵ).
Why is this plausible? Well, we are basically saying that the larger the amount of util the mugger claims he can simulate, the smaller the chance that he can actually do it.
If I told you I had an easy way to make $20 if you gave me $10, then you might believe me. If I told you I had an easy way to make $200 if you gave me $10, then you would be more skeptical. If I told you I had an easy way to make $2,000,000 if you gave me $10, then you would dismiss me without a second thought.
In the mugger’s case specifically, we would imagine that when he pops outside of this simulation he has a finite amount of resources, and so the more he claims to be able to simulate, the less probable it is that he actually can.
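A minimal numeric sketch of that intuition, assuming a prior that decays like 1/X² (the exact decay rate is made up; the argument only needs P(X) to fall faster than 1/X):

```python
# A hypothetical prior over the mugger's claims: the more utility X he
# claims he can simulate, the less likely it is that he actually can.
def P_of_X(X: float) -> float:
    return min(1/10, 10 / X**2)  # decays faster than 1/X (illustrative choice)

for X in [10**2, 10**4, 10**6, 10**8]:
    print(f"X = {X:g}: P(X) = {P_of_X(X):g}, expected value of refusing = {-P_of_X(X) * X:g}")

# The expected loss from refusing shrinks toward 0 as the threat grows,
# so it never comes close to the certain -5 that compliance would cost.
```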
- ^
From Language, Proof, and Logic 2nd Edition
But does the probability decrease fast enough?
Yeah, that’s the question. Saying that |−P(X)⋅X| < ϵ means that P(X) < ϵ/X. So if X doubles, then it’s required that P is at least cut in half. I doubt there is a proof of this per se, but in a situation as strange as this it seems reasonable to me that if you claim you can do 10 times as much of something, then that is at least 10 times less likely.
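A quick numeric check of that scaling (ϵ = 0.01 is arbitrary):

```python
# |-P(X) * X| < eps is the same as P(X) < eps / X, so each doubling of
# X halves the probability the bound allows you to assign.
eps = 0.01
for X in [10, 20, 40, 80]:
    print(f"X = {X}: need P(X) < {eps / X:g}")
```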
I guess the main point I wanted to make is that in the usual phrasings of Pascal’s Mugging the choice of X is oftentimes taken after the choice of P. But P should be a function of X. So the mugger at least has to include this in his argument, and (some of) the burden of proof is on him.
Yeah, since I learned about it, I always thought this was the obviously correct solution to Pascal’s mugging. But for some reason it was rarely mentioned in the past, as far as I know.