I agree. Self-replicating nanotech seems likely to be a much harder problem than language models becoming good enough actors to gain political, cultural, and economic power.
To the extent that an AGI can make political and economic decisions that are of higher quality than human decisions, there’s also a lot of pressure for humans to delegate those decisions to AGI. Organizations that delegate those decisions to AGI will outcompete those that don’t.