if you believe in short timelines to superintelligence
Due to the serial speed advantage of AIs, superintelligence is unnecessary for making humanity irrelevant within a few years of the first AGIs capable of autonomous, unbounded research. Conversely, without such AGI, the impact on society will remain bounded rather than overturning everything.
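A back-of-envelope sketch of the serial speed point: even without any gain in capability per thought, running faster compresses calendar time into subjective research time. The speedup factors below are illustrative assumptions, not figures from the text.

```python
# Sketch: subjective researcher-years accumulated by an AI running
# `speedup` times faster than a human over a wall-clock interval.
# The speedup values are hypothetical, chosen only for illustration.
def subjective_years(wall_clock_years: float, speedup: float) -> float:
    """Wall-clock years experienced as subjective research time."""
    return wall_clock_years * speedup

for speedup in (10, 50, 100):
    years = subjective_years(3, speedup)
    print(f"{speedup:>3}x serial speedup: 3 wall-clock years "
          f"-> {years:.0f} subjective years of research")
```

Under these assumptions, three calendar years at a 100x serial speedup correspond to three centuries of subjective research, which is why the argument does not need superintelligence, only speed.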