you also have no reason to carry on your business dealing with ordinary things
Yes, I do. Dealing with ordinary things has a positive expected utility. Analysing anything that looks like a Pascal’s Mugging has ~zero expected utility as far as the wager itself goes, plus whatever utility I derive from curiosity and a desire to study logical problems. I believe that Counterargument #5 can be tuned and expanded to apply to all such muggings, so I’ll be writing that up in a bit :)
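To make that concrete, here’s a minimal sketch with entirely invented numbers; the one structural assumption is a prior that shrinks faster than the claimed payoff grows, which is what keeps the wager term near zero:

```python
# Minimal sketch; every number here is invented for illustration.
ordinary_ev = 0.6 * 10                     # routine deal: p=0.6 of 10 utilons -> EV 6
claimed_payoff = 10**100                   # the mugger's ridiculous offer
prior = claimed_payoff ** -2               # assumption: prior falls off faster than the payoff grows
mugging_wager_ev = prior * claimed_payoff  # 1e-100 -- ~zero, as claimed above
curiosity_ev = 2                           # utility from studying the logical puzzle itself
print(ordinary_ev, mugging_wager_ev + curiosity_ev)  # 6 vs ~2: ordinary business wins
```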
p(I am god) = 0 is simpler and gets the job done
Assuming Bayesian probability, p=0 means “I refuse to consider new evidence”, which is contrary to the goal of “bullshit, prove it” (I suspect that p=1/infinity might have practically the same issue unless dealing with a god who can provide infinite bits of evidence; fortunately in this case you are making exactly that claim :))
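A quick sketch of why p = 0 is a trap (the likelihoods are invented; only the structure matters): Bayes’ rule makes the posterior proportional to the prior, so a prior of exactly 0 is immune to any finite amount of evidence:

```python
def posterior(prior, p_e_given_h, p_e_given_not_h):
    """Bayes' rule: P(H|E) = P(E|H) P(H) / P(E)."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# Evidence with a trillion-to-one likelihood ratio in favour of "I am god":
print(posterior(1e-20, 1e-12, 1e-24))  # ~1e-8: a tiny prior still moves 12 orders of magnitude
print(posterior(0.0,   1e-12, 1e-24))  # 0.0: no finite evidence can ever move it
```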
Yes, I forgot to mention that if I’m a god I can stop time while I’m flipping coins.
This falls back to 3b, then: My utility function isn’t calibrated to a universe where you can ignore physics. Furthermore, it also falls back to 1b: Once we assume physics doesn’t apply, we get an infinite number of theories to choose from, all with equal likelihood, so once again why select your theory out of that chaos?
How would this work in general? How could you plan for landing on the moon if it hasn’t been done before?
p(moon landing) = 0.
p(I will enjoy trying despite the inevitable failure) > 0.
p(I will feel bad if I ignore the math saying this IS possible) > 0.
p(People who did the moon landing had different priors) > 0.
etc.
It’s not elegant, but it occurred to me as a seed of a thought, and I should have a more robust version in a little bit :)
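Here’s that seed as a few lines of arithmetic, with every utility invented; the only point is that the side terms can carry the decision even when p(success) is pinned at 0:

```python
p_landing = 0.0                      # my prior: it has never been done
ev_try  = p_landing * 1_000_000 + 5  # payoff term vanishes; joy of trying remains
ev_skip = -3                         # guilt for ignoring the math that says it IS possible
print("try" if ev_try > ev_skip else "skip")  # -> try
```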
Dealing with ordinary things has a positive expected utility. Analysing anything that looks like a Pascal’s Mugging has ~zero expected utility as far as the wager itself goes, plus whatever utility I derive from curiosity and a desire to study logical problems.
I agree with your conclusion, but don’t follow the reasoning. Can you say more about how you identify something that looks like a Pascal’s Mugging?
If what makes something look like a Pascal’s Mugging is that it involves ridiculously large utilities, then maybe you agree with me that you should have bounded utilities.
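For concreteness, here is one toy bounded utility function; the tanh shape and the constants are my arbitrary choices, not anything canonical. Ordinary payoffs register roughly linearly, but no claimed payoff, however ridiculous, can push utility past the bound:

```python
import math

def bounded_utility(payoff, u_max=100.0, scale=50.0):
    # Saturates at u_max, so a mugger can't win just by naming a bigger number.
    return u_max * math.tanh(payoff / scale)

print(bounded_utility(10))       # ~19.7: small payoffs behave almost linearly
print(bounded_utility(1_000))    # ~100.0: already saturated
print(bounded_utility(10**100))  # 100.0: "eternal salvation" caps out too
```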
This falls back to 3b, then: My utility function isn’t calibrated to a universe where you can ignore physics.
The laws of physics are discovered, not known a priori, so you can’t really use them as a way to make decisions.
Furthermore, it also falls back to 1b: Once we assume physics doesn’t apply, we get an infinite number of theories to choose from, all with equal likelihood
Not equal likelihood. Universal Prior, Solomonoff induction.
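A toy version of what I mean (real Solomonoff induction is uncomputable, and these description lengths are invented): weight each hypothesis by 2^−(description length in bits), and the infinitely many theories are nothing like equally likely; the simple ones soak up almost all the prior mass:

```python
# Toy universal-prior weighting; the description lengths in bits are invented.
hypotheses = {
    "physics as usual":             20,
    "physics plus one exception":  120,
    "coin-stopping trickster god": 500,
}
priors = {h: 2.0 ** -bits for h, bits in hypotheses.items()}
for h, p in priors.items():
    print(f"{h}: {p:.3g}")  # ~1e-6, ~8e-37, ~3e-151 -- nothing like equal
```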
so once again why select your theory out of that chaos?
Once you have chaos, you have a problem. Selecting my theory over the others is only an issue for me if I want to collect money, but the chaos is a problem for you even if you don’t select my theory. You’ll end up being jerked around by some other unlikely god.
It’s not elegant, but it occurred to me as a seed of a thought, and I should have a more robust version in a little bit
I’ll be interested to read about it. Good luck. I hope there’s something there for you to find.
If what makes something look like a Pascal’s Mugging is that it involves ridiculously large utilities, then maybe you agree with me that you should have bounded utilities.
“Pascal’s Mugging” seems to be any scam that involves ridiculously large utilities, and probably specifically those that try to exploit the payoff-versus-likelihood ratio in that way. A scam is approximately “an assertion that you should give me something, despite a lack of strong evidence supporting my assertion”. So if you offered me $1,000, it’d just be a scam. If you offer me eternal salvation, it’s Pascal’s Mugging.
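The exploit shows up in one line of expected-value arithmetic (the probability is invented): hold my scepticism fixed and vary only the size of the promise, and the naive EV flips from negligible to astronomical:

```python
p_believe = 1e-9            # my credence in the scammer's story, the same in both cases
print(p_believe * 1_000)    # plain scam: EV ~ 1e-6 dollars -- walk away
print(p_believe * 10**100)  # "eternal salvation": naive EV ~ 1e91 -- the ratio does the mugging
```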