It should either specify that if Omega predicts the human will use that kind of entropy, then the human gets a “Fuck you” (nothing in the big box, or worse), or, at best, that Omega rewards that kind of randomization with a proportional payoff (i.e., if behavior is determined by a fair coin, then the big box contains half the money).
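To make the proportional rule concrete, here is a minimal sketch in Python. The payoff scale and the function name are my own illustrative assumptions, not part of the original problem statement:

```python
def big_box_contents(p_one_box: float, full_payoff: int = 1_000_000) -> int:
    """Fill the big box in proportion to Omega's predicted
    probability that the player one-boxes.

    p_one_box=1.0 -> full payoff; p_one_box=0.5 (fair coin) -> half;
    p_one_box=0.0 -> nothing (the "Fuck you" case).
    """
    if not 0.0 <= p_one_box <= 1.0:
        raise ValueError("probability must be in [0, 1]")
    return int(p_one_box * full_payoff)


print(big_box_contents(1.0))  # 1000000
print(big_box_contents(0.5))  # 500000 (fair coin)
print(big_box_contents(0.0))  # 0
```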
Or that Omega is smart enough to predict any randomizer you have available.
The FAQ states that Omega has (or is) a computer the size of the moon. That’s huge, but finite. I believe it’s possible, with today’s technology, to create a randomizer that an Omega of this size cannot predict. However smart Omega is, one can always create a randomizer that Omega cannot break.
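As a sketch of what such a randomizer might look like in practice: the snippet below decides by drawing from the OS entropy pool. The premise that this pool mixes in physical noise (hardware RNGs, interrupt timings) that a finite simulator cannot reproduce is the commenter’s assumption, not something the OS guarantees:

```python
import os


def coin_flip() -> bool:
    """Return True or False from one byte of the OS entropy pool,
    which mixes in physical noise from the machine's environment."""
    return os.urandom(1)[0] % 2 == 0


# Decide the Newcomb choice by external entropy rather than by any
# deterministic policy Omega could simulate (on the stated assumption).
choice = "one-box" if coin_flip() else "two-box"
print(choice)
```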
True, but just because such a randomizer is theoretically possible doesn’t mean you have one to hand.