Superintelligence is a broad overview of the topic without any aspirations for rigor, as far as I can tell, and it is pretty clear about that.
> …who simply assume that an AI will instantly gain any capability up to and including things which would require more energy than there exists in the universe or more computational power than would be available from turning every atom in the universe into computronium.
This seems uncharitable. The outside view certainly backs up something like
> every jump in intelligence opens up new, previously unexpected and unimaginable sources of energy.
Examples: fire, fossil fuels, nuclear. Same applies to computational power.
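The size of those jumps can be sketched numerically. The figures below are rough, order-of-magnitude specific-energy values (MJ/kg) from standard physics references, used only to illustrate how large each transition was:

```python
# Approximate specific energy of successive energy sources, in MJ/kg.
# Values are order-of-magnitude estimates for illustration only.
sources = {
    "wood (fire)": 16,
    "coal (fossil)": 30,
    "gasoline (fossil)": 46,
    "uranium-235 (fission)": 8.0e7,  # ~80 TJ/kg
}

baseline = sources["wood (fire)"]
for name, mj_per_kg in sources.items():
    # Express each source as a multiple of wood's specific energy.
    print(f"{name}: {mj_per_kg / baseline:.1e}x wood")
```

The fossil-fuel step is a small multiple over wood, while fission is roughly six orders of magnitude beyond chemical fuels entirely — each transition was not an incremental improvement but access to a qualitatively different energy regime.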
There is no clear reason to expect this trend to stop. Thus what we would currently deem
> more energy than there exists in the universe
or
> more computational power than would be available from turning every atom in the universe into computronium
might reflect our limited understanding of the Universe, rather than any kind of genuine limits.