Some entertain the hypothesis that the K-complexity of the entire universe may be under 1,000 bits—though nobody knows whether that is true.
More can be said about a single apple than about all the apples in the world.
Handing me an entire universe and saying “here you go, there is a superintelligence in there somewhere, find it yourself” does not qualify as giving me a superintelligence (in the information sense). In fact I need the same amount of information to find a superintelligence in the “1,000 bit universe” as the minimum K-complexity of the superintelligence itself.
The K-complexity of the entire universe is basically irrelevant.
Well, perhaps minus 1,000 bits. Those 1,000 bits might be screening off a lot of universes with no superintelligences in them, so they could matter a great deal. If so, that’s a smaller space to search—by a factor of 2^1,000.
A description of a superintelligence might be quite easy to find in our universe; for example, if the entire universe were tiled with AIs for most of its lifetime, it may not require much information to point to one of them once you have described the universe. If the Kolmogorov complexity of the universe were really only 1,000 bits, this could easily beat cousin_it’s suggestion.
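The argument above can be made concrete with a toy calculation (all quantities here are illustrative assumptions, not real K-complexity values): describing an object as “the universe program, plus an index locating the object inside its output” costs roughly K(universe) + log2(number of candidate locations) bits. If the universe is densely tiled with copies of the object, the index is cheap.

```python
import math

# Toy model (illustrative assumptions only): a "universe" is the output of a
# short program, and an object within it is described by the pair
# (universe program, index of the object's location).

UNIVERSE_BITS = 1000  # hypothetical K-complexity of the universe program


def pointer_cost(num_locations):
    """Bits needed to single out one location among num_locations candidates."""
    return math.log2(num_locations)


def description_cost(num_locations):
    """Upper bound on describing the object: universe program + pointer."""
    return UNIVERSE_BITS + pointer_cost(num_locations)


# Sparse universe: superintelligences are rare, so the pointer dominates.
sparse = description_cost(2**500)   # 1000 + 500 = 1500 bits

# Tiled universe: AIs almost everywhere, so the pointer is nearly free.
tiled = description_cost(2**10)     # 1000 + 10 = 1010 bits
```

Under these made-up numbers, the tiled universe gives a far shorter description, which is the sense in which the 1,000-bit universe hypothesis could matter.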
This is an upper bound, not a lower bound. Roughly, that means you assume the worst whenever you lack proof to the contrary.
What? Cousin_it gave an upper bound. Timtyler and I are pointing out a plausible way the value could be significantly below that upper bound.
It is true that the grandparent does not fit this context. Mistake!