It looks like he meant something like, “if it takes 10,000 years to get to AI, then other changes — like biological modification, singleton formation, cultural/values drift, and the stochastic risk of civilization-collapsing war — are the most important areas for affecting humanity’s future.”