To clarify: we are not running any programs, just providing code. In a sense, we are competing at the task of providing descriptions for very large numbers with an upper bound on the size of the description (and the requirement that the description is computable).
Oh, I see that I misread.
One problem is that “every possible RNG call” may be an infinite set. For a really simple example, a binary {0,1} RNG with the program “add 1 to your count if you roll a 1 and repeat until you roll a 0” admits arbitrarily long roll sequences and has no maximum output. It halts with probability 1, though.
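For concreteness, a minimal C sketch of that counting program, with rand()’s low bit standing in for the fair {0,1} RNG (just an illustrative assumption; the argument only needs some fair binary source):

```c
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

/* Counting program from the comment above: flip a fair {0,1} RNG,
 * add 1 to the count on each 1, and halt on the first 0.
 * The output is geometrically distributed, so the run halts with
 * probability 1 but the output has no finite upper bound.
 * rand()'s low bit is only a stand-in for a real fair coin. */
int main(void) {
    srand((unsigned)time(NULL));
    unsigned long count = 0;
    while (rand() & 1) {   /* rolled a 1: add 1 and roll again */
        count++;
    }                      /* rolled a 0: stop */
    printf("%lu\n", count);
    return 0;
}
```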
If you allow the RNG to be configured for arbitrary distributions, then you can have it return a number from such a distribution in a single call, still with no maximum.
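A minimal sketch of that single-call variant, again only illustrative: here the “configured” distribution is Geometric(1/2), sampled by inverse transform from one uniform draw (the specific distribution and the use of rand() are assumptions for the example, not anything fixed by the thread):

```c
#include <math.h>
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

/* Single-call version: the "RNG" is configured as a Geometric(1/2)
 * sampler via inverse-transform sampling. One uniform draw, and the
 * returned value still has no maximum. */
unsigned long geometric_draw(void) {
    double u = (rand() + 1.0) / ((double)RAND_MAX + 2.0); /* u in (0,1) */
    return (unsigned long)floor(log(u) / log(0.5));       /* Geometric(1/2) */
}

int main(void) {
    srand((unsigned)time(NULL));
    printf("%lu\n", geometric_draw());
    return 0;
}
```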
Oops, yeah, the written programs are supposed to be deterministic. The point of mentioning the RNG was to handle the fact that an AI might derive its performance from a strong random number generator, which a deterministic C program can’t emulate.