If the thing you say is true, superintelligence will just build specialized narrow superintelligences for particular tasks, just like how we build machines. It doesn’t leave us much chance for trade.
This also presupposes that:
1. The system has an absolute advantage relative to human civilisation at developing specialised systems for any given task.
2. The system also has a comparative advantage at doing so.
I think #1 is dubious for attainable strongly superhuman general intelligences, and #2 is likely nonsense.
I think #2 only stops sounding like nonsense if you ignore all economic constraints.
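To make the comparative-advantage point concrete, here is a toy sketch in Python (the numbers and task names are invented for illustration, not taken from this discussion): even if one party has the absolute advantage at every task, it cannot by definition have the comparative advantage at every task, unless the productivity ratios happen to be exactly equal.

```python
# Toy numbers, invented for illustration: output per unit of scarce
# effort/compute for two hypothetical tasks.
productivity = {
    "AI system": {"chip_design": 100.0, "cooling_maintenance": 10.0},
    "humanity":  {"chip_design":   1.0, "cooling_maintenance":  2.0},
}

def opportunity_cost(agent, task, other_task):
    """Units of `other_task` forgone to produce one unit of `task`."""
    p = productivity[agent]
    return p[other_task] / p[task]

for agent in productivity:
    oc = opportunity_cost(agent, "cooling_maintenance", "chip_design")
    print(f"{agent}: 1 unit of maintenance costs {oc:g} units of chip design")

# Output:
#   AI system: 1 unit of maintenance costs 10 units of chip design
#   humanity: 1 unit of maintenance costs 0.5 units of chip design
#
# The AI system has the absolute advantage in *both* tasks, yet it cannot
# have the comparative advantage in both: opportunity costs are ratios, so
# whichever task it is relatively worse at, the other party is relatively
# better at.
```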
I think the problem is defining superintelligence as something that is "efficient relative to human civilisation on all cognitive tasks of economic importance". My objection is that the thing you have defined may not be physically possible: attainable strongly superhuman general intelligences are not the thing you have defined.
You can round off my position to "certain definitions of superintelligence just seem prima facie infeasible or unattainable to me" without losing much nuance.
I actually can’t imagine any subtask of “turning the world into paperclips” where humanity can have any comparative advantage. Can you give an example?