...it doesn’t matter how slow humans are at AGI research...
It matters, because if we only go as far as the moon, if you'll forgive the space exploration metaphor, and then need thousands of years to reach the next star system, humans will adapt to cope with the long journey.
It seems pretty challenging to envisage humans “adapting” to the existence of superintelligent machines on any realistic timescale—unless you mean finding a way to upload their essences into cyberspace.
It looks like he meant something like, “if it takes 10,000 years to get to AI, then other changes, like biological modification, singleton formation, cultural/values drift, stochastic risk of civilization-collapsing war, etc., are the most important areas for affecting humanity’s future.”