I have a longer draft on this, but my current take is that the high-level answer to the question is similar for crabs and ontologies (& more).
Convergent evolution usually happens because of similar selection pressures + some deeper contingencies.
Looking at the selection pressures for ontologies and abstractions, there are a bunch of pressures which are fairly universal and in various ways apply to humans, AIs, animals...
For example: negentropy is costly ⇒ flipping fewer bits and storing fewer bits is selected for; consequences include
- part of concepts: clustering is compression (see the sketch after this list)
- discretization/quantization/coarse-graining: all is compression
…
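To make "coarse-graining is compression" concrete, here is a minimal sketch (my illustration, not part of the original comment; `coarse_grain`, the bin count, and the bit accounting are assumptions chosen for the example): a stream of 32-bit floats gets replaced by a tiny codebook of bin centers plus a 3-bit index per sample, trading a small reconstruction error for a large drop in stored bits.

```python
import numpy as np

def coarse_grain(samples, n_bins=8):
    """Quantize real-valued samples into n_bins equal-width bins.

    Returns a small codebook (bin centers) plus one integer index per
    sample -- the coarse-grained description of the data.
    """
    lo, hi = samples.min(), samples.max()
    edges = np.linspace(lo, hi, n_bins + 1)
    # np.digitize assigns each sample to a bin; clip keeps indices in range
    idx = np.clip(np.digitize(samples, edges) - 1, 0, n_bins - 1)
    centers = (edges[:-1] + edges[1:]) / 2
    return centers, idx

rng = np.random.default_rng(0)
x = rng.normal(size=10_000).astype(np.float32)

centers, idx = coarse_grain(x, n_bins=8)

raw_bits = x.size * 32                            # storing every float exactly
coded_bits = idx.size * 3 + centers.size * 32     # 3 bits/sample + tiny codebook
recon_error = np.abs(x - centers[idx]).mean()

print(f"raw: {raw_bits} bits, coarse-grained: {coded_bits} bits, "
      f"mean abs error: {recon_error:.3f}")
```

Clustering (e.g. k-means) does the same thing with data-adapted centers instead of equal-width bins: the cluster labels are the compressed description.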
The intentional stance is to a decent extent a ~compression algorithm: it assumes some systems can be decomposed into "goals" and an "executor" (now the cat is chasing one mouse, now some other mouse). Yes, this is again not the full explanation, because it leads to the question of why there are systems in the territory for which this works, but it is a step.
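As a hedged toy sketch of that compression claim (the grid world, the `greedy_step` policy, and the counting below are hypothetical illustrations, not anything from the comment): if a system really does decompose into a fixed executor plus a goal, then storing the goal alone lets you regenerate the whole trajectory, which is far cheaper than listing every state.

```python
# A toy "executor": a greedy policy that steps one cell toward the goal.
def greedy_step(pos, goal):
    x, y = pos
    gx, gy = goal
    if x != gx:
        return (x + (1 if gx > x else -1), y)
    if y != gy:
        return (x, y + (1 if gy > y else -1))
    return pos

def rollout(start, goal):
    """Full trajectory produced by the executor chasing the goal."""
    traj = [start]
    while traj[-1] != goal:
        traj.append(greedy_step(traj[-1], goal))
    return traj

start, goal = (0, 0), (7, 5)
traj = rollout(start, goal)

# Describing the system "extensionally": one (x, y) pair per timestep.
flat_description = len(traj) * 2      # numbers to store
# Intentional-stance description: just the goal (the executor is shared/known).
intentional_description = 2           # numbers to store: gx, gy

assert rollout(start, goal) == traj   # the goal regenerates the trajectory
print(flat_description, "numbers vs.", intentional_description)
```

The compression only works because the executor is shared across many such systems; that pushes the question back to why the territory contains goal-plus-executor systems at all, which is the point conceded above.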