I have no reason to select you as a likely god candidate, compared to the ~infinite number of people who exist across all of space-time and all Everett branches.
Agreed. However, you also have no reason to carry on your business dealing with ordinary things rather than focusing exclusively on the various unlikely gods that might be trying to jerk you around. I don’t win, but you lose.
2b) Even if you flip coins incredibly fast and in parallel, I will still eventually die, so we can only count the number of coin flips that happen before then.
Yes, I forgot to mention that if I’m a god I can stop time while I’m flipping coins.
Assume a utility function which is finite but unbounded. It cannot handle infinity, and thus your mugging relies on an invalid input (infinite utility), and is discarded as malformed.
If you play by those rules, you can’t assign a utility to the infinite gamble, so you can’t make decisions about it. If the infinite gamble is possible, your utility function is failing to do its job, which is to help you make decisions. Tell me how you want to fix that without bounded utility.
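To make the failure concrete, here's a minimal sketch in Python. The shape of the gamble (payoff doubling on each successive heads, St. Petersburg style) is my own assumption for illustration, not something specified above; the point is that an unbounded utility gives a partial sum that grows without limit as more flips are allowed, while a bounded utility converges:

```python
# Minimal sketch of the problem: with an unbounded utility the expected
# utility of the gamble grows without limit as more flips are allowed,
# so no finite value can be assigned to the "infinite" version; a bounded
# utility converges.  The gamble shape (payoff doubling each round,
# St. Petersburg style) is an illustrative assumption.

import math

def expected_utility(utility, max_flips):
    """Sum p(game ends on flip n) * utility(payoff at flip n)."""
    total = 0.0
    for n in range(1, max_flips + 1):
        p = 0.5 ** n        # probability the game ends on flip n
        payoff = 2 ** n     # payoff doubles every round
        total += p * utility(payoff)
    return total

unbounded = lambda x: x                      # utility = payoff
bounded = lambda x: 1 - math.exp(-x / 100)   # saturates near 1

for flips in (10, 100, 1000):
    print(flips,
          expected_utility(unbounded, flips),   # ~flips: grows without bound
          expected_utility(bounded, flips))     # converges below 1
```

Allowing flips to go to infinity sends the first column to infinity, which is exactly the value the unbounded function cannot hand to a decision procedure.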
my mind assigns p(you are god) = “bullshit, prove it”, and this is about the closest I can come to expressing that mathematically
p(I am god) = 0 is simpler and gets the job done. That appears to be more restrictive than the Universal Prior—I think the universal prior would give positive probability to me being god. There might be a general solution here to specifying a prior that doesn’t fall into these pits, but I don’t know what it is. Do you?
Assign probabilities by frequency of occurrence. There have been no instances of god yet, so p(god) = 0. Once god has been demonstrated, I can update off of this 0, unlike with Bayesian statistics.
How would this work in general? How could you plan for landing on the moon if it hasn’t been done before? You need to distinguish “failure is certain because we put a large bomb in the rocket that will blow up before it gets anywhere” from “failure is certain because it hasn’t been done before and thus p(success) = 0”.
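For contrast, here's one standard way of keeping those two cases apart; Laplace's rule of succession is my choice of example, not something either of us has proposed. It assigns a small positive probability to events that have never been attempted and only drives the estimate toward zero as failed attempts accumulate:

```python
# A raw frequency gives p = 0 for anything with no successes, whether the
# event is untried or demonstrably doomed; Laplace's rule of succession
# (an illustrative choice, not proposed above) stays positive until
# failures actually pile up.

def frequency_estimate(successes, attempts):
    return successes / attempts if attempts else 0.0

def laplace_estimate(successes, attempts):
    # Rule of succession: add one imaginary success and one imaginary failure.
    return (successes + 1) / (attempts + 2)

# Moon landing before anyone has tried: raw frequency calls it impossible.
print(frequency_estimate(0, 0), laplace_estimate(0, 0))    # 0.0 vs 0.5

# Rocket with a bomb in it, 50 attempts, 50 failures: both are near zero,
# but only the second got there by accumulating actual evidence.
print(frequency_estimate(0, 50), laplace_estimate(0, 50))  # 0.0 vs ~0.019
```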
you also have no reason to carry on your business dealing with ordinary things
Yes I do. Dealing with ordinary things has a positive expected utility. Analysing anything that looks like a Pascal’s Mugging has ~zero expected utility as far as the wager itself goes, plus whatever utility I get from curiosity and a desire to study logical problems. I believe that Counterargument #5 can be tuned and expanded to apply to all such muggings, so I’ll be writing that up in a bit :)
p(I am god) = 0 is simpler and gets the job done
Assuming Bayesian probability, p=0 means “I refuse to consider new evidence”, which is contrary to the goal of “bullshit, prove it” (I suspect that p=1/infinity might have practically the same issue unless dealing with a god who can provide infinite bits of evidence; fortunately in this case you are making exactly that claim :))
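A few lines of Bayes' rule make that concrete (the likelihoods are purely illustrative): a prior of exactly zero survives any amount of updating, while even an absurdly small positive prior can be pushed up by enough evidence:

```python
# Why p = 0 amounts to "I refuse to consider new evidence": under Bayes'
# rule a prior of exactly zero stays zero no matter how strong the
# evidence, while even a tiny positive prior can be driven toward 1.
# The likelihoods below are purely illustrative.

def update(prior, likelihood_if_true, likelihood_if_false):
    """One application of Bayes' rule for a binary hypothesis."""
    numerator = prior * likelihood_if_true
    denominator = numerator + (1 - prior) * likelihood_if_false
    return numerator / denominator if denominator else 0.0

for prior in (0.0, 1e-30):
    p = prior
    for _ in range(100):              # 100 strong pieces of evidence,
        p = update(p, 0.99, 0.01)     # each worth ~6.6 bits
    print(prior, "->", p)             # 0.0 stays 0.0; 1e-30 ends up ~1.0
```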
Yes, I forgot to mention that if I’m a god I can stop time while I’m flipping coins.
This falls back to 3b, then: My utility function isn’t calibrated to a universe where you can ignore physics. Furthermore, it also falls back to 1b: Once we assume physics doesn’t apply, we get an infinite number of theories to choose from, all with equal likelihood, so once again why select your theory out of that chaos?
How would this work in general? How could you plan for landing on the moon if it hasn’t been done before?
p(moon landing) = 0.
p(I will enjoy trying despite the inevitable failure) > 0.
p(I will feel bad if I ignore the math saying this IS possible) > 0.
p(People who did the moon landing had different priors) > 0.
etc.
It’s not elegant, but it occurred to me as a seed of a thought, and I should have a more robust version in a little bit :)
Dealing with ordinary things has a positive expected utility. Analysing anything that looks like a Pascal’s Mugging has ~zero expected utility as far as the wager itself goes, plus that derived from curiosity and a desire to study logical problems.
I agree with your conclusion, but don’t follow the reasoning. Can you say more about how you identify something that looks like a Pascal’s Mugging?
If something looks like a Pascal’s Mugging when it involves ridiculously large utilities, then maybe you agree with me that you should have bounded utilities.
This falls back to 3b, then: My utility function isn’t calibrated to a universe where you can ignore physics.
The laws of physics are discovered, not known a priori, so you can’t really use that as a way to make decisions.
Furthermore, it also falls back to 1b: Once we assume physics doesn’t apply, we get an infinite number of theories to choose from, all with equal likelihood
Not equal likelihood. Universal Prior, Solomonoff induction.
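Roughly, and only roughly (actual Solomonoff induction is uncomputable, and string length below is just a stand-in for program length), the idea is that hypotheses get weight on the order of 2^(-description length), so the more baroque gods get exponentially less of it:

```python
# Toy illustration only: a universal-style prior weights hypotheses by
# roughly 2^(-description length), so the infinitely many possible gods
# are not equally likely.  String length stands in for program length.

def universal_style_prior(description):
    return 2.0 ** -len(description)

hypotheses = [
    "physics as usual",
    "physics as usual, except the mugger is a god who can pause time",
    "physics as usual, except some other god pauses time and inverts gravity on Tuesdays",
]

weights = {h: universal_style_prior(h) for h in hypotheses}
total = sum(weights.values())
for h, w in weights.items():
    print(f"{w / total:.3e}  {h}")   # longer stories get exponentially less weight
```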
so once again why select your theory out of that chaos?
Once you have chaos, you have a problem. Selecting my theory over the others is only an issue for me if I want to collect money, but the chaos is a problem for you even if you don’t select my theory. You’ll end up being jerked around by some other unlikely god.
It’s not elegant, but it occurred to me as a seed of a thought, and I should have a more robust version in a little bit
I’ll be interested to read about it. Good luck. I hope there’s something there for you to find.
If something looks like a Pascal’s Mugging when it involves ridiculously large utilities, then maybe you agree with me that you should have bounded utilities.
“Pascal’s Mugging” seems to be any scam that involves ridiculously large utilities, and probably specifically those that try to exploit the payoff-vs-likelihood ratio in that way. A scam is approximately “an assertion that you should give me something, despite a lack of strong evidence supporting my assertion”. So if you offered me $1,000, it’d just be a scam; if you offered me eternal salvation, it’s Pascal’s Mugging.
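For reference, here's what the bounded-utility fix from the parent comment might look like in practice; the cap, scale, and probability are arbitrary numbers of my own choosing:

```python
# Toy version of the bounded-utility fix: once utility saturates, the size
# of the promised payoff stops mattering, so a tiny p(mugger is god) keeps
# the expected utility of paying up negligible.  Cap, scale, and the
# probability are arbitrary illustrative numbers.

import math

U_MAX = 100.0      # utility can never exceed this cap
SCALE = 1_000.0    # dollars at which utility reaches ~63% of the cap

def bounded_utility(dollars):
    return U_MAX * (1 - math.exp(-dollars / SCALE))

p_mugger_is_god = 1e-20
for offer in (1_000, 1_000_000, 1e100):
    ev = p_mugger_is_god * bounded_utility(offer)
    print(f"offer {offer:.3g}: expected utility of accepting ~ {ev:.2e}")
```

Whether to accept that fix is exactly what's in dispute above; the sketch just shows that it does cap the mugger's leverage.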