I was hoping for Eliezer’s answer. If you have an answer, I’d advise posting it separately.
As for your answer: suppose it’s more likely that he’ll torture 3^^^^3 people if you give him the money. Then you can’t give him the money, and he’s just Pascal-mugging you into not giving him the money. It’s the same principle.
Also, the same trick can be run in infinitely many ways. I’m sure there’s some variant in which the correct choice ends up being one you wouldn’t otherwise have made.
It’s not the same at all. This is not a problem invoking Omega; if you want that, go to the lifespan dilemma.
If we know Omega has a 3^^^^3-sided die and will kill the people if it lands on the one, then I’d shut up and calculate.
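Once the probability is pinned down like that, the calculation is trivial. A sketch, using a stand-in value for the number of sides since 3^^^^3 itself is far too large to represent on any computer (the algebra is identical for any N):

```python
from fractions import Fraction

# Stand-in for 3^^^^3, which no computer could actually represent;
# the arithmetic below works out the same for any number of sides N.
N = 10**100

p_loss = Fraction(1, N)  # chance the die lands on the one fatal face
lives_at_stake = N       # people killed if it does

expected_deaths = p_loss * lives_at_stake
print(expected_deaths)   # 1 -- exactly as bad as one certain death
```

The point of using exact rationals is that the two enormous numbers cancel precisely: a 1-in-N chance of N deaths has the same expected cost as one certain death, no matter how large N is.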
Pascal’s wager involves much more uncertainty than that: uncertainty about the character speaking. Once a being claims it has magic and wants you to do something, then to the extent one believes the magic part, one loses one’s frame of reference for judging the being as truthful, non-whimsical, and so on.
Are you arguing that he’s more likely to torture them if you give him the money, that the probabilities are equal to within one part in 3^^^^3, or that since it isn’t a die, probability works fundamentally differently?
My response was assuming the first. The second is ridiculous, and I don’t think anyone would consider it if it weren’t for the bias toward giving round numbers as probabilities. If it’s the third, I’d suggest reading “Probability Is in the Mind.” You don’t know which side the die will land on; this is no different from not knowing what kind of person the character is.
suppose it’s more likely that he’ll torture 3^^^^3 people if you give him the money
That’s a different problem from Pascal’s Wager. Taking it back to the original, it would be like saying “Convert to Christianity pro forma for a chance at heaven rather than no chance of heaven, ignoring all other magical options.” The problem with this isn’t the quantities of utility involved; it’s the assumption that a god who cares about such conversions to Christianity is the only option for the divine, rather than a God of Islam who would burn Christian converts hotter than atheists, or a PC Christian god who would have a heaven for all who were honest with themselves and didn’t go through pro forma conversions. The answer to the wager is that the arbitrary assumption that all forms of magic but one have less probability than that one story about magic is a dumb assumption.
It’s fine to consider Pascal’s Wager*, where Pascal’s Wager* is the wager under the added assumption that our interlocutor is trustworthy; but that’s a different problem, and it is well articulated as the lifespan dilemma, which is legitimately posed as a separate problem.
Since probability is in the mind, when I ask “what would a magical being of infinite power be doing if it asked me for something while disguised as a probably-not-magical being?”, my best guess is that it is a test with small consequences, and I can’t distinguish between the chances of “it’s serious” and “it’s a sadistic being who will do the opposite of what it said.”
The problem with this isn’t the quantities of utility involved, it’s the assumption that a god who cares about such conversions to Christianity is the only option for the divine, rather than a God of Islam who would burn Christian converts hotter than atheists, or a PC Christian god who would have a heaven for all who were honest with themselves and didn’t go through pro forma conversions.
Each of these possibilities has some probability associated with it. Taking them all into account, what is the expected utility of being a Christian? One may ignore those to make the question simpler, but unless all the possibilities cancel out nicely, you’re still going to end up with something.
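As a sketch of that point, with entirely invented probabilities and utilities (none of these numbers are claims about the actual hypotheses), summing over the rival gods almost never yields exactly zero:

```python
# All numbers here are invented for illustration; the point is only
# that distinct hypotheses about the divine rarely cancel exactly.
hypotheses = [
    # (probability, utility of a pro forma conversion under this hypothesis)
    (1e-12,  1e6),   # a god who rewards the conversion
    (1e-12, -1e6),   # a rival god who punishes converts hardest
    (1e-12, -1e6),   # a god who rewards only those honest with themselves
]

expected_utility = sum(p * u for p, u in hypotheses)
print(expected_utility)  # tiny, but nonzero: the terms don't cancel
```

However small the result, it is still a definite number with a definite sign, which is exactly the "you're still going to end up with something" point above.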
The answer to the wager is that the random assumption that all forms of magic but one have less probability than that one story about magic is a dumb assumption.
Perhaps no one possibility outweighs all the rest, but if you add them all together, they’d point in one general direction. The sum is so close to zero that if you tried to calculate it, you’d barely do better than chance. You’d still do better, though.
I think there is a significant chance you are right, but that it is less than .5. I hope others can add to this discussion. I am reminded of this: if you tell me I am seeing an actual banana that I am holding, rather than an image my brain made of a collection of atoms, then... I don’t even know anymore.