That does raise the question of “efficient for what?”, though. If we discount the idea that nonsapient agents are likely to end up as better fully general optimizers than sapient ones (and thus destined to inherit the universe whether we like it or not), we’re left with a mess of sapient agents that presumably want to further their own interests. They could potentially let themselves be outcompeted in limited domains by introducing subsapient category optimizers, but in most cases I can’t think of a reason why they’d want to.
Is Blindsight some kind of cool science fiction thing that I should know about?
Reproducing and taking resources from other agents are enough to cause the apocalypse. If sapients band together to notice and eliminate resource-takers, they can prevent this sort of apocalypse. In doing so, they will essentially create a singleton, finally ending evolution and banishing the blind idiot God.
Shades of Blindsight there.
Ah, sorry. Novel by Peter Watts, available here. Touches on some of the issues you introduced.