So my question is: What are the ‘obvious’ candidates for limits that take over before everything optimizable is optimized by runaway technology?
There aren’t any that I’m aware of, except for “a disaster happens and everyone dies” — but that’s bad luck, not a hard limit. I would respond with something along the lines of: “Exponential growth can’t continue forever, but where it levels out has huge implications for what life will look like, and it seems likely it will level out far above our current level rather than just above it.”