So it looks like the Pascal’s mugger problem can be reduced to two problems that need to be solved anyway for an FAI: how to be optimally rational given a finite amount of computing resources, and how to assign probabilities to mathematical statements in a reasonable way. Does that sound right?
I’m not sure I agree with that one—where does the question of anthropic priors fit in? The question is how to assign probabilities to physical statements in a reasonable way.
You may be aware of the use of negative probabilities in machine learning, quantum mechanics, and, of course, economics.
For the last, the existence of a Matrix Lord has such a large negative probability that it swamps his proffer (perhaps because it is altruistic?) and no money changes hands. In other words, there is nothing interesting here; it’s just that some decision theories haven’t incorporated negative probabilities yet (a rough sketch of that arithmetic follows below).
The reverse situation, Job’s complaint against God, is more interesting. It shows why variables with negative probabilities tend to disappear from discourse, to be replaced by the difference between two independent ‘normal’ variables; in this case Cosmic Justice is replaced by the I-Thou relationship of ‘God’ and ‘Man’.
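For concreteness, here is a minimal sketch of the expected-utility arithmetic the Matrix Lord point above seems to be gesturing at. All of the numbers (the negative probability, the stand-in utility for saving 3^^^3 people, the $5 stake) are hypothetical placeholders, not anything from the original exchange:

```python
# Toy expected-utility calculation for the mugger's offer, allowing a
# (hypothetical) negative probability for "the mugger is a Matrix Lord".
P_MATRIX_LORD = -1e-30    # assumed negative probability (placeholder value)
U_SAVE_PEOPLE = 1e100     # stand-in for the utility of saving 3^^^3 people
U_LOSE_FIVE = -5.0        # utility of handing over the five dollars

# EU(pay) = P(Matrix Lord) * U(people saved) + (1 - P) * U(lose $5)
eu_pay = P_MATRIX_LORD * U_SAVE_PEOPLE + (1 - P_MATRIX_LORD) * U_LOSE_FIVE
eu_refuse = 0.0

print(eu_pay)               # hugely negative: the negative probability swamps the offer
print(eu_pay < eu_refuse)   # True, so no money changes hands
```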
Can you give me an example of something with negative probability?
I will offer you a bet: if it doesn’t happen, you have to give me a dollar, but if it does happen, you have to give me everything you own. I find it hard to believe that there’s anything where that’s considered good odds.
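To spell out why that bet is a useful test, here is a minimal sketch of the expected payment under a hypothetical negative probability; the wealth figure and the value of p are made-up placeholders:

```python
# Expected payment, from the perspective of the person accepting the bet:
# pay $1 if the event doesn't happen, pay everything (wealth) if it does.
def expected_payment(p, wealth):
    return (1 - p) * 1.0 + p * wealth

W = 100_000.0  # placeholder net worth

print(expected_payment(0.01, W))   # ordinary probability: an expected loss (about 1001)
print(expected_payment(-0.01, W))  # negative probability: a nominal expected *gain*
                                   # (about -999), even though every actual outcome
                                   # is a payment -- hence the skepticism
```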
For the last, the existence of a Matrix Lord has such a large negative probability that it swamps his proffer (perhaps because it is altruistic?)
If it has such a large negative probability, wouldn’t you try to avoid ever giving someone five dollars, since they anti-might be a Matrix Lord, and you can’t risk a negative probability of them sparing 3^^^3 people?
Also, when you mention quantum mechanics, I think you’re confusing the wavefunction (the probability amplitude) with the probability density. The amplitude can be any complex number, but the probability is proportional to the square of its magnitude. If the amplitude at some point is 1, −1, i, or −i, the probability of seeing the particle there is the same.
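A quick numerical check of that claim (just the standard Born rule, nothing specific to this thread):

```python
# Born rule: probability density is the squared magnitude of the amplitude,
# so amplitudes differing only by a phase give identical probabilities.
amplitudes = [1 + 0j, -1 + 0j, 1j, -1j]
probabilities = [abs(a) ** 2 for a in amplitudes]
print(probabilities)  # [1.0, 1.0, 1.0, 1.0]
```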
Quantum mechanics actually has led to some study of negative probabilities, though I’m not familiar with the details. I agree that they don’t come up in the standard sort of QM and that they don’t seem helpful here.