Ah, so you meant: No physically possible series of Bayesian updates can promote a hypothesis to prominence if its prior probability is that low. And Peter meant: It is decision-theoretically useless to include a subroutine for tracking probability increments of 1/3^^^^^3 in your algorithm.
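To spell out the arithmetic behind that first reading (a rough sketch, assuming each observation carries at most $B$ bits of evidence, where $B$ is an illustrative bound rather than anything stated in this thread): each Bayesian update multiplies the odds by a likelihood ratio of at most $2^B$, so with prior odds of $1/(3\uparrow\uparrow\uparrow\uparrow\uparrow 3)$, $n$ updates give posterior odds of at most

$$\frac{P(H \mid E_1, \dots, E_n)}{P(\neg H \mid E_1, \dots, E_n)} = \frac{P(H)}{P(\neg H)} \prod_{i=1}^{n} \frac{P(E_i \mid H, E_1, \dots, E_{i-1})}{P(E_i \mid \neg H, E_1, \dots, E_{i-1})} \le \frac{2^{nB}}{3\uparrow\uparrow\uparrow\uparrow\uparrow 3},$$

and promoting the hypothesis even to even odds requires $n \ge \log_2(3\uparrow\uparrow\uparrow\uparrow\uparrow 3)/B$ observations, a number that is itself far beyond anything physically realizable.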
But the non-Bayesian source of your Bayesian prior might output 1/3^^^^^3 as the prior probability of an event—as surely for the coin flip example as for Robin Hanson’s anthropic one.
To be precise, it’s impossible to describe any sensory event with a prior probability that low. You can describe hypotheses conditional on which a macro-event has a probability that low. For example, conditional on the hypothesis that a coin is fixed to have a 1/3^^^3 probability of coming up heads, the probability of seeing heads is 1/3^^^3. But barring the single specific case of Hanson’s hypothesized anthropic penalty being rational, I know of no way to describe, in words, any hypothesis which could justly be assigned so low a prior probability as 1/3^^^3. That includes the hypothesis that purple is falling upstairs, that my socks are white and not white, or that 2 + 2 = 5 is a consistent theorem of Peano arithmetic.
Coin’s fixed.
The log_2(3^^^^^3) consecutive binary digits of pi, starting from digit number 3^^^^^3, are all 0?
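(For the arithmetic that makes this a candidate: treating the unexamined binary digits of pi as independent fair bits, which is only a heuristic since base-2 normality of pi is conjectured rather than proven, the chance that a run of $\log_2(3\uparrow\uparrow\uparrow\uparrow\uparrow 3)$ specified digits are all 0 is

$$2^{-\log_2(3\uparrow\uparrow\uparrow\uparrow\uparrow 3)} = \frac{1}{3\uparrow\uparrow\uparrow\uparrow\uparrow 3},$$

which is exactly the disputed magnitude.)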
The simulators are messing with you.
Then our minds are “fixed” too, just like the coin.
How many dust specks in the eye are you willing to bet on that?