Solomonoff induction halves the probability of a hypothesis for each additional bit it takes to describe.
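A minimal sketch of what that bit-counting rule amounts to, with made-up description lengths (the hypotheses and bit counts below are purely illustrative):

```python
# Length-based prior in the style of Solomonoff induction: a hypothesis
# whose shortest description is L bits gets weight proportional to 2**-L,
# so each additional bit halves the weight. The lengths are invented.
hypotheses = {
    "the mugger is lying": 10,
    "the mugger is lying, with an elaborate backstory": 11,
    "the mugger really controls 3^^^3 lives": 60,
}

weights = {h: 2.0 ** -bits for h, bits in hypotheses.items()}
total = sum(weights.values())
for h, w in weights.items():
    print(f"{h}: prior ~ {w / total:.2e}")

# One extra bit (10 -> 11 above) exactly halves the unnormalized weight,
# but even the 60-bit hypothesis keeps a prior many orders of magnitude
# above 1/3^^^3.
```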
See, I told everyone that people here say this.
Fake muggings with large numbers are more profitable to the mugger than fake muggings with small numbers, because the mugging with the larger number is more likely to convince a naive rationalist. And the profitability depends on the size of the number, not the number of bits in the number, which makes the likelihood of a large number being fake grow faster than the number of bits in the number.
You are solving the specific problem of the mugger, and not the general problem of tiny bets with huge rewards.
Regardless, there’s no way the probability decreases faster than the reward the mugger promises grows. I don’t think you can assign 1/3^^^3 probability to anything. That’s an unfathomably small probability. You are literally saying there is no amount of evidence the mugger could give you that would convince you otherwise. Even if he showed you his matrix powers, and the computer simulation of 3^^^3 people, you still wouldn’t believe him.
How could he show you “the computer simulation of 3^^^3 people”? What could you do to verify that 3^^^3 people were really being simulated?
You probably couldn’t verify it. There’s always the possibility that any evidence you see is made up. For all you know you are just in a computer simulation and the entire thing is virtual.
I’m just saying he can show you evidence which increases the probability. Show you the racks of servers, show you the computer system, explain the physics that allows it, let you do the experiments that show those physics are correct. You could solve any NP-complete problem on the computer. And you could run programs that take known numbers of steps to compute, like actually calculating 3^^^3, etc.
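For concreteness, here is a toy odds-form update along those lines; the prior and the likelihood ratios are invented numbers, just to show that this kind of evidence moves the probability:

```python
from math import log10

# Toy Bayesian updates in odds form: each independent demonstration the
# mugger passes multiplies the odds by its likelihood ratio.
# Every number here is an assumption for illustration.
prior = 1e-100          # starting probability that the mugger's claim is true
odds = prior / (1 - prior)

demonstrations = [
    ("solves a huge NP-complete instance instantly", 1e12),
    ("shows physics that checks out under your own experiments", 1e10),
    ("runs a program with a verifiable, astronomical step count", 1e8),
]

for name, likelihood_ratio in demonstrations:
    odds *= likelihood_ratio
    p = odds / (1 + odds)
    print(f"after '{name}': P ~ 10^{log10(p):.0f}")

# The probability climbs from 1e-100 to roughly 1e-70: still minuscule,
# but it does move -- which it never could if it started at 1/3^^^3.
```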
Sure. But I think there are generally going to be more parsimonious explanations than any that involve him having the power to torture 3^^^3 people, let alone having that power and caring about whether I give him some money.
Parsimonious, sure. The possibility is very unlikely. But it doesn’t just need to be “very unlikely”, it needs to have smaller than 1/3^^^3 probability.
Sure. But if you have an argument that some guy who shows me apparent magical powers has the power to torture 3^^^3 people with probability substantially over 1/3^^^3, then I bet I can turn it into an argument that anyone, with or without a demonstration of magical powers, with or without any sort of claim that they have such powers, has the power to torture 3^^^3 people with probability nearly as substantially over 1/3^^^3. Because surely for anyone under any circumstances, Pr(I experience what seems to be a convincing demonstration that they have such powers) is much larger than 1/3^^^3, whether they actually have such powers or not.
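One way to put numbers on that (all of them assumptions): in odds form, a demonstration multiplies your odds by at most 1 / P(demonstration | no such powers), so the size of the possible update is capped by how easily the experience could arise without the powers being real.

```python
# Ceiling on what any demonstration can do to the probability.
# Posterior odds = prior odds * P(demo | powers) / P(demo | no powers),
# and that ratio is at most 1 / P(demo | no powers).
# Both numbers below are illustrative assumptions.
p_demo_given_no_powers = 1e-20   # trickery, hallucination, simulation...
max_update_factor = 1 / p_demo_given_no_powers

prior = 1e-50                    # assumed prior for "can torture 3^^^3 people"
posterior_ceiling = prior * max_update_factor

print(f"update factor capped at {max_update_factor:.0e}")
print(f"posterior at most about {posterior_ceiling:.0e}")

# The demonstration buys at most ~20 orders of magnitude. Against a scale
# like 3^^^3 that is a rounding error, so whether the posterior ends up
# "substantially over" 1/3^^^3 is decided almost entirely by the prior.
```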
Correct. That still doesn’t solve the decision theory problem; it makes it worse, since you now have to take into account the possibility that anyone you meet might have the power to torture (or reward with utopia) 3^^^3 people.
It makes it worse or better, depending on whether you decide (1) that everyone has the power to do that with probability >~ 1/3^^^3 or (2) that no one has. I think #2 rather than #1 is correct.
Well, doing basic Bayes with a Kolmogorov prior gives you (1).
About as unfathomably small as the number of 3^^^3 people is unfathomably large?
I think you’re relying on “but I feel this can’t be right!” a bit too much.
I don’t see what your point is. Yes that’s a small number. It’s not a feeling, that’s just math. If you are assigning things 1/3^^^3 probability, you are basically saying they are impossible and no amount of evidence could convince you otherwise.
You can do that and be perfectly consistent. If that’s your point I don’t disagree. You can’t argue about priors. We can only agree to disagree, if those are your true priors.
Just remember that reality could always say “WRONG!” and punish you for assigning 0 probability to something. If you don’t want to be wrong, don’t assign 1/3^^^3 probability to things you aren’t 99.9999...% sure absolutely can’t happen.
Eliezer showed a problem with that reasoning in his post on Pascal’s Muggle.
Basically, human beings do not have an actual prior probability distribution. This should be obvious, since it means assigning a numerical probability to every possible state of affairs. No human being has ever done this, or ever will.
But you have something like a prior, and you build the prior itself based on your experience. At the moment we don’t have a specific number for the probability of the mugging situation coming up; we just think it’s very improbable, so that we don’t expect any evidence to ever come up that would convince us. But if the mugger shows matrix powers, we would change our prior so that the probability of the mugging situation was high enough to be convinced by being shown matrix powers.
You might say that means it was already that high, but it does not mean this, given the objective fact that people do not have real priors.
Maybe humans don’t really have probability distributions. But that doesn’t help us actually build an AI which reproduces the same result. If we had infinite computing power and could do ideal Solomonoff induction, it would pay the mugger.
Though I would argue that humans do have approximate probability functions and approximate priors. We wouldn’t be able to function in a probabilistic world if we didn’t. But it’s not relevant.
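A sketch of why an idealized expected-utility maximizer with that kind of prior pays; the reward below is a stand-in far smaller than 3^^^3 (which cannot be represented at all), and the other numbers are placeholders:

```python
from fractions import Fraction

# Toy expected-value comparison for "hand over the $5 or not".
# All quantities are placeholders; 3^^^3 itself is replaced by 10**1000
# because no computer can hold the real number.
p_mugger_honest = Fraction(1, 10**100)   # assumed prior that the threat is real
lives_at_stake = 10**1000                # stand-in for 3^^^3
cost_of_paying = 5                       # utility cost of handing over the money

expected_gain = p_mugger_honest * lives_at_stake - cost_of_paying
print(expected_gain > 0)   # True: the reward term swamps the cost

# Unless the probability falls at least as fast as the promised reward
# grows, the expected value of paying is positive, so the idealized agent
# pays the mugger.
```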
That’s just a regular Bayesian probability update! You don’t need to change terminology and call it something different.
That’s fine. I too think the situation is extraordinarily implausible. Even Solomonoff induction would agree with us. The probability that the mugger is real would be something like 1/10^100. Or perhaps the exponent should be orders of magnitude larger than that. That’s small enough that it shouldn’t even remotely register as a plausible hypothesis in your mind. But big enough that some amount of evidence could convince you.
You don’t need to posit new models of how probability theory should work. Regular probability works fine at assigning really implausible hypotheses really low probability.
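Some rough arithmetic on what “big enough that some amount of evidence could convince you” means here (the 1e-100 figure is the one suggested above):

```python
from math import log2

# How much evidence does it take to overcome a prior of 1e-100?
prior = 1e-100
bits_needed = -log2(prior)   # about 332 bits, i.e. odds of ~10^100 to 1
print(f"roughly {bits_needed:.0f} bits of evidence")

# ~332 bits is an enormous amount of evidence, but it is the kind of thing
# an extraordinarily weird day could in principle supply. Overcoming a
# prior of 1/3^^^3 would instead take log2(3^^^3) bits -- a number far too
# large to ever write down, which is the sense in which no evidence could
# convince you.
```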
But that is still way, way bigger than 1/3^^^3.