Unless you have a reason to believe that there is some constraint on what numbers could be used—if only a limited number of digits will fit on the bead, for example—your probability for each integer has to be infinitesimal.
You’re not allowed to do that. With a countably infinite set, your only option for a prior that gives every element a nonzero probability is a convergent infinite series whose terms sum to 1. (Exponential distributions, like the one proposed by Peter above, are the most elegant for certain questions, but you can use p(n) = c * n^(-2) or something else if you prefer a slower decay of probabilities.)
In the case with colors rather than integers, a good prior on “first bead color, named in a form acceptable to Omega” would correspond to this: take this sort of distribution, starting with the most salient color names and working out from there, but being sure not to exceed 1 in total.
Of course, this is before Omega asks you anything. You then have to have some prior on Omega’s motivations, with respect to which you can update your initial prior when ve asks “Is it red?” And yes, you’ll be very metauncertain about both these priors… but you’ve got to pick something.
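To make the slower-decay option concrete: over the positive integers, the series 1/n^2 sums to pi^2/6, so c = 6/pi^2 normalizes it (a two-sided version over all nonzero integers would just halve c). A quick sketch:

```python
import math

# Normalizer for p(n) = c * n^(-2) over the positive integers,
# since the sum over n >= 1 of 1/n^2 is pi^2/6 (the Basel problem).
c = 6 / math.pi ** 2

def p(n):
    return c / n ** 2

# Partial sums approach 1 (the tail beyond the cutoff is ~ c/cutoff):
partial = sum(p(n) for n in range(1, 10 ** 6))
print(partial)            # ~0.9999994

# Polynomial decay is much slower than exponential decay:
print(p(1))               # ~0.6079
print(p(100))             # ~6.08e-05, vs 2^(-100) =~ 7.9e-31
```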
I am happy with that explanation. Thanks.
Why not, say, p(n) = (1/3) * 2^(-|n|)?
If p(n) = (1/3) * 2^(-|n|), then:
p(1) = (1/3) * 2^(-1) = 0.166666667
p(86) = (1 / 3) * (2^(-86)) = 4.30823236 × 10^(-27)
p(1,000,000) = (1/3) * 2^(-1,000,000) = Lower than Google’s calculator lets me go
Are you willing to bet that 1 is going to happen that much more often than 1,000,000?
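Those values check out; here is a quick sketch verifying them with exact rational arithmetic (the code and truncation cutoff are my own illustration):

```python
from fractions import Fraction

def p(n):
    # p(n) = (1/3) * 2^(-|n|), computed exactly
    return Fraction(1, 3 * 2 ** abs(n))

print(float(p(1)))    # 0.16666666666666666
print(float(p(86)))   # ~4.308e-27
# p(1,000,000) is far below floating-point range, but exact arithmetic
# handles it; its denominator 3 * 2^1000000 is a 1000002-bit integer:
print(p(10 ** 6).denominator.bit_length())  # 1000002

# The distribution really does sum to 1 over the integers:
# p(0) + 2 * sum_{n>=1} p(n) = 1/3 + 2/3. Truncating far out:
total = p(0) + 2 * sum(p(n) for n in range(1, 200))
print(float(total))   # ~1.0
```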
The point is that your probability for the “first” integers will not be infinitesimal. If you think that drops off too quickly, then instead of 2 use 1+e for some small e > 0: p(n) = e/(e+2) * (1+e)^(-|n|). And replace n with s(n) if you don’t like that ordering of the integers. But regardless, there is some N such that for any n with |n| <= N and any m with |m| sufficiently far beyond N, p(n)/p(m) >> 1.
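For what it’s worth, the normalization constant here checks out numerically; in this sketch e is a small positive decay parameter (e.g. 0.01), not Euler’s constant:

```python
# p(n) = e/(e+2) * (1+e)^(-|n|) over the integers; e is a small
# positive decay parameter (here 0.01), not Euler's number.
e = 0.01

def p(n):
    return e / (e + 2) * (1 + e) ** (-abs(n))

# p(0) + 2 * sum_{n>=1} p(n) = (e/(e+2)) * (1 + 2/e) = 1; truncate far out:
total = p(0) + 2 * sum(p(n) for n in range(1, 20000))
print(total)             # ~1.0

# And ratios between "early" and "late" integers still blow up:
print(p(3) / p(2000))    # (1.01)^1997, roughly 4e8
```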
I wasn’t talking about limiting frequencies, so don’t ask me “how often?”
Would you bet $1 billion against my $1 that no number with absolute value smaller than 3^^^3 will come up? If not then you shouldn’t be assigning infinitesimal probability to those numbers.
I get the feeling that I am thinking about this incorrectly but am missing a key point. If someone out there can see it, please let me know.
Sorry.
If the set of possible options is all integers and Omega asks about a particular integer, why would the probability go up the smaller the number gets?
Betting on ranges seems like a no-brainer to me. If Omega comes and asks you to pick an integer and then asks me to bet on whether an object pulled from the jar will have an absolute value over or under that integer, I should always bet that the number will be higher than yours.
If I had a random number generator that could theoretically pull a random number from all integers, it seems weird to assume it will be small. As far as I know, such a random number generator is impossible. Assuming it is impossible, there must be a cap somewhere in the set of all integers. The catch is that we have no idea where this cap is. If you can write 3^^^3, I can write 3^^^3 + 1, which leads me to believe that no matter what number you pick, the cap will be significantly higher. As long as I can cover the costs of the bet, I should bet against you.
The math works like this: given the option to place my $X against your $Y that, when you pick an integer Z, Omega will pull a number out of the jar that is greater than the absolute value of Z, there is a way to express (X/Y) * Z + 1, and, assuming I place equal probability on each possible integer between 0 and (X/Y) * Z + 1, I should always take the bet.
A trivial example: If I bet $5 against your $1 and you pick the integer 100, I can easily imagine the number 501. 501/100 is greater than 5/1. I should take the bet.
The problem seems to be that I am placing equal probabilities on each possible integer while you favor numbers closer to 0. Favoring numbers like 1 or 2 makes a lot of sense if someone came up to me on the street with a bucket of balls with numbers printed on them. I would also consider the chances of pulling 1 to be much higher than 3^^^3.
So, perhaps, my misstep is thinking of Omega’s challenge as a purely theoretical puzzle and not associating it with the real world. In any case, I certainly do not want to give the impression that I think 3^^^3 is just as likely to appear as 42 in the real world. Of course, in the real world I wouldn’t bet on anything at all because I do not consider the information available to be useful in determining the correct action and I am ridiculously risk averse.
To dispel this confusion, you should read up on algorithmic information theory.
Is there a good place to start online? Can I just Google “algorithmic information theory”?
Google first and ask questions later. ;-)
You are not actually placing equal probabilities on each possible integer, since that is impossible: no such probability distribution exists. In fact you recognize this by saying there’s a cap somewhere out there; you just don’t know where. Well, this cap means that small numbers (smaller than the cap) have much, much higher probability (i.e. nonzero) than large numbers (those higher than the cap have zero probability).
Maybe this will serve as an intuition pump: suppose you’ve narrowed the cap down to just two possibilities, N and 2N, and you’ve given them equal weight. Well, now p(1) = (1/N + 1/(2N))/2 = 3/(4N), but p(N+k) = 1/(4N) (for 1 <= k <= N), and p(2N+k) = 0. The probability goes down as the numbers get larger. Determine your priors over all possible caps, compute the resulting distribution, and you’ll find that p(n) eventually starts to decrease.
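That mixture is easy to compute explicitly; here is a sketch with a hypothetical N = 10, restricting to the positive integers as in the example:

```python
from fractions import Fraction

N = 10  # hypothetical scale; the shape of the result holds for any N

def p(n):
    """Mixture of two equally weighted cap hypotheses:
    uniform on 1..N (weight 1/2) and uniform on 1..2N (weight 1/2)."""
    prob = Fraction(0)
    if 1 <= n <= N:
        prob += Fraction(1, 2) * Fraction(1, N)
    if 1 <= n <= 2 * N:
        prob += Fraction(1, 2) * Fraction(1, 2 * N)
    return prob

print(p(1))          # 3/40, i.e. 3/(4N)
print(p(N + 1))      # 1/40, i.e. 1/(4N)
print(p(2 * N + 1))  # 0: beyond both caps
print(sum(p(n) for n in range(1, 2 * N + 1)))  # 1: a proper distribution
```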