The number of assertions needed is now so large that it may be difficult for a human to acquire that much knowledge.
Especially since these are likely to be significant lower bounds, and since they don't account for the difficulty of running on spotty evolutionary hardware, I suspect the discrepancy is even greater than it first appears.
What I find intriguing about this result is that it is one of the few I've seen that offers a limit description of consciousness: on one hand you have a measure of the complexity of your "conscious" cognitive system, and on the other you have adherence to the world, based on the population of your assertions. Consciousness is maintained if, as you increase complexity, you preserve the variety of the assertion population.
It is possible, however, that the convergence rates for humans and prospective GAI will simply differ, which makes a certain amount of sense. Ideal consciousness in this model is unachievable, and approaching it faster is more costly, so there are good evolutionary reasons for our brains to be as meagerly conscious as possible—even to fake consciousness when the resources would not otherwise be missed.