It irritates me when people talk as though Pascal’s Mugging or Wager is an obvious fallacy, but then give responses which are themselves fallacious: saying that the probability of the opposite is equal (it is not), or claiming that there are alternative scenarios which are just as likely while actually offering much less probable ones (e.g. that there is a god who rewards people for being atheists), or saying that when you are dealing with infinities it does not matter which one is more probable (it does). You are quite correct that people are just going with their intuition and trying to justify it afterward, and it would be much more honest if they admitted that.
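To make that last point concrete, here is one way to cash out "it matters which infinity is more probable"; this is only a sketch of one possible formalization, not something the comment itself commits to. The idea is to treat the infinite payoff as a limit of finite stakes N, in which case the ordering by probability survives at every stage:

```latex
% One limit-based reading: fix a finite stake N and compare
% gamble A (win N with probability p_1) against gamble B
% (win N with probability p_2), where p_1 > p_2.
\[
  \mathbb{E}[A] = p_1 N \;>\; p_2 N = \mathbb{E}[B]
  \qquad \text{for every finite } N .
\]
% Naive infinite expected utilities erase this ordering, since
% p_1 \cdot \infty = p_2 \cdot \infty = \infty, but the comparison
% holds at every finite stage of the limit N \to \infty, which is
% one way to say the more probable gamble should still win.
```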
It seems to me that the likely true answer is that there is a limit to how much a human being can care about something, so when you assign an incredibly high utility, this does not correspond to anything you care about in reality, and so you don’t follow decision theory when this comes up. For example, suppose that you knew with 100% certainty that XiXiDu’s last scenario was true: whenever you say “abracadabra”, it causes a nearly immeasurable amount of utility in the simulations, but of course this never touches your experience. Would you do nothing for the rest of your life except say “abracadabra”? Of course not, any more than you are going to donate all of your money to save children in Africa. The assigned number simply does not correctly represent what you care about. This is also the answer to Eliezer’s Lifespan Dilemma: Eliezer does NOT care about an infinite lifespan as much as he says he does, or he would indeed take the deal. Likewise, if he really cared infinitely about eternal life, he would become a Christian (or a member of some other religion promising eternal life) immediately, no matter how low the probability of success. But neither he nor any other human being cares infinitely about anything.
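A minimal sketch of this "bounded caring" idea, assuming a hypothetical saturating utility function; the exponential form, the cap, the scale, and all the numbers below are my own illustration, chosen only so that utility approaches a finite ceiling as the claimed payoff grows:

```python
import math

def bounded_utility(claimed_utils: float, cap: float = 100.0, scale: float = 50.0) -> float:
    """Map a claimed payoff onto a utility that saturates at `cap`.

    The exponential form is just one convenient choice; any function
    that approaches a finite ceiling gives the same qualitative result.
    """
    return cap * (1.0 - math.exp(-claimed_utils / scale))

# A mugger's offer: tiny probability of an astronomically large payoff.
p_mugger = 1e-20
claimed = 3.0 ** 100          # stands in for "3^^^3 utils"; any huge number works

# With unbounded utility the expected value explodes...
unbounded_ev = p_mugger * claimed                  # ~5.15e+27, dominates everything

# ...but with bounded utility it is capped by p * cap, which is negligible.
bounded_ev = p_mugger * bounded_utility(claimed)   # at most 1e-18

print(f"unbounded EV: {unbounded_ev:.3e}")
print(f"bounded EV:   {bounded_ev:.3e}")
```

On this picture, the mugger cannot win by inflating the claimed payoff: past the saturation point, a bigger promise buys essentially no additional expected utility, which matches the observation that nobody actually spends their life saying “abracadabra”.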
I agree that people frequently give fallacious responses, and that the probability of the opposite is not equal (it may be much higher). I disagree with roughly everything else in the parent. In particular, “god” is not a natural category. By this I mean that even if we assume a way to get eternal life exists, the conditional probability of any particular religion that promises it still seems vanishingly small, much smaller than the chance of us screwing up the hypothetical opportunity by deliberately adopting an irrational belief.
This does not necessarily mean that we can introduce infinite utility and it will add up to normality. But we do observe the real Eliezer taking unusual actions that could put him in a good position to live forever, should the possibility exist.
I agree with basically all of this.