Yeah, it is if you completely ignore the unique and defining feature of every Pascal’s mugging: the conditionality of the reward on your assessed probability… ಠ_ಠ
I don’t understand this. In the original Pascal’s wager, it is suggested that you become Christian, start going through the motions, and this will eventually change your belief so that you think God’s existence is likely. But this is not a feature of the generalized Pascal’s mugging, at least not as described on the wiki page.
(Also, given that this is the Stupid Questions Thread, I feel your comment would be improved by more explanation and less disapproving emoticons...)
No, in the original Pascal’s wager you are advised to believe in God, as God would judge you based on your beliefs (i.e., your assessed probability of his existence). However, that doesn’t seem to be the form of Pascal’s mugging, which is also discussed quite a bit on this site. The conditionality of reward or punishment on subjective probability estimates doesn’t seem to be the point at which decision theories break down; rather, they seem to break down with very small probabilities of very large effects.
Actually, Pascal did advise “going through the motions” as a solution to being unable to simply will oneself into belief. The wager might not be strong apologetics, but I give Pascal some credit for his grasp of cognitive dissonance.
I stand corrected.
Yes, this is what I was trying to say. I see how the phrase “conditionality of the reward on your assessed probability” could describe Pascal’s Wager, but not how it could describe Pascal’s Mugging.
More concisely than the original/gwern, the algorithm used by the mugger is roughly (see the sketch after this list):

1. Find your assessed probability of the mugger being able to deliver whatever reward, being careful to specify the size of the reward in the conditions for the probability.
2. Offer an exchange such that U(payment to mugger) < U(reward) * P(reward).
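For concreteness, here is a minimal sketch in Python of the naive expected-utility rule this exploits; all numbers are hypothetical illustrations, not anyone's actual estimates.

```python
# Minimal sketch of the naive expected-utility rule the mugger exploits.
# All numbers are hypothetical illustrations, not anyone's actual estimates.

def naive_agent_pays(payment_utility: float,
                     reward_utility: float,
                     reward_probability: float) -> bool:
    """Pay iff the expected utility of the reward exceeds the cost of paying."""
    return reward_utility * reward_probability > payment_utility

# The mugger names a reward so large that even a tiny assessed probability
# leaves the expected value dominating the $5 payment.
print(naive_agent_pays(payment_utility=5.0,
                       reward_utility=3.0 ** 64,    # stand-in for 3^^^^3, which overflows floats
                       reward_probability=1e-20))   # -> True: the naive agent pays
```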
This is an issue for AI design because if you use a prior based on Kolmogorov complexity, then it’s relatively straightforward to find such a reward: even very large numbers can have relatively low complexity, and therefore relatively high prior probabilities.
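A toy illustration of why, using raw string length as a crude stand-in for Kolmogorov complexity (an assumption made purely for illustration; true K is uncomputable):

```python
# Under a description-length prior P(x) ~ 2**(-bits(description)), a number's
# prior probability falls with how briefly it can be written, not with its size.
# String length is used here as a crude stand-in for Kolmogorov complexity.

def crude_prior(description: str) -> float:
    return 2.0 ** (-8 * len(description))  # assume 8 bits per character

print(crude_prior("3^^^^3"))          # 6 chars  -> ~3.6e-15
print(crude_prior("72891305877442"))  # 14 chars -> ~1.9e-34

# The up-arrow expression names an unimaginably larger number, yet it gets the
# *higher* prior, so U(reward) * P(reward) can be made to grow without bound.
```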
When you have a bunch of other data, you should not be interested in the Kolmogorov complexity of the number alone; you are interested in the Kolmogorov complexity of the other data concatenated with that number.
E.g., you should not assign higher probability to Bill Gates having made precisely $100,000,000,000 than to some random-looking value: given the other sensory input you got (from which you derived your world model), there are random-looking values that give even lower Kolmogorov complexity for the total sensory input. But you wouldn’t be able to find those, because Kolmogorov complexity is uncomputable; you end up mis-estimating it when you don’t have it handed to you on a platter, pre-made.
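As a computable toy stand-in for that point (an assumption for illustration: a compressor is used as a rough proxy for K, and such a proxy can mis-rank things):

```python
# What matters is the complexity of the whole data stream with the claim
# appended, not the complexity of the number in isolation. zlib's compressed
# size is a crude, computable proxy for Kolmogorov complexity here.
import zlib

# Hypothetical stand-in for the sensory input behind your world model.
world_model = b"news reports, balance sheets, and filings about Bill Gates... " * 50

def joint_cost(claim: bytes) -> int:
    return len(zlib.compress(world_model + claim))

print(joint_cost(b"net worth: $100000000000"))
print(joint_cost(b"net worth: $87314159265"))
# Whichever claim compresses better *given the context* is what a complexity
# prior should favor; the round number earns no automatic advantage.
```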
Actually, what you should use is algorithmic (Solomonoff) probability, like AIXI does, on the history of sensory input, to form a weighted sum over the world models that present you with the marketing spiel of the mugger. The shortest ones simply have the mugger making it up; then there are the models where the mugger will torture beings if you pay and not torture if you don’t, and so on. It’s unclear what will come out of this and how it will pan out because, again, it’s uncomputable.
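A drastically simplified sketch of that weighted sum (the description lengths below are hypothetical; the real mixture ranges over all programs and is uncomputable):

```python
# Solomonoff-style mixture: each world model that reproduces the mugger's
# spiel gets weight 2**(-description_length_in_bits); predictions are the
# weighted sum over all of them. Lengths below are hypothetical.

models = [
    # (description length in bits, does paying actually change outcomes?)
    (40, False),  # shortest models: the mugger simply made it all up
    (90, True),   # longer models: the threat is real and payment matters
    (95, False),  # longer still: the threat is real but payment changes nothing
]

total = sum(2.0 ** -bits for bits, _ in models)
p_payment_matters = sum(2.0 ** -bits for bits, matters in models if matters) / total
print(p_payment_matters)  # ~1e-15: dominated by the short "made it up" models
```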
In the human approximation, you take what the mugger says as a privileged model, which is, strictly speaking, an invalid update (the probability jumps from effectively zero, for never having thought about it, to nonzero), and invalid updates come with a cost: being prone to losing money. Constructing the model directly from what the mugger says the model should be is a hack; at that point anything goes, and you can have another hack, of the strategic kind, that refuses to apply this string->model hack to ultra-extraordinary claims made without evidence.
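A toy version of that second, strategic hack might look like this (the threshold and inputs are hypothetical):

```python
# Strategic hack: refuse to promote a verbal claim into a privileged world
# model when the claimed stakes are extraordinary and no evidence accompanies
# it. Threshold and inputs are hypothetical.

def promote_claim_to_model(claimed_utility: float,
                           evidence_bits: float,
                           extraordinary_threshold: float = 1e9) -> bool:
    if claimed_utility >= extraordinary_threshold and evidence_bits == 0:
        return False  # don't apply the string->model shortcut at all
    return True

print(promote_claim_to_model(claimed_utility=3.0 ** 64, evidence_bits=0.0))  # False
```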
Edit: I meant “weighted sum”, not “select”.
The mugging is defined as having conditionality; just read Bostrom’s paper or Baumann’s reply! That Eliezer did not explicitly state the mugger’s simple algorithm, but instead implied it in his discussion of complexity and size of numbers, does not obviate this point.
That’s too bad. If you had asked some questions, perhaps someone could have helped you understand.
‘I don’t understand this’ usually means ‘Would somebody please explain?’.
I might as well take a shot at explaining. Pascal’s wager says I might as well take on the relatively small inconvenience of going through the motions of believing in God, because if the small-probability event occurs that he does exist, the reward is extremely large or infinite (eternal life in heaven, presumably).
Pascal’s mugging instead makes this a relatively small payment ($5, as Yudkowsky phrased it) to avoid or mitigate a minuscule chance that someone may cause a huge amount of harm (putting dust specks in 3^^^^3 people’s eyes, or whatever the current version is).
Thus for both of them, people are faced with making some small investment to vastly increase their utility, should an event of minuscule probability occur. A lottery ticket, essentially.