I agree. The world could be at a higher risk of conflict just before or after the first ASI is created. Even with a fast takeoff, the risk remains in the period before the takeoff if it is obvious that an ASI is about to be created.
This scenario is described in quite a lot of detail in chapter 5 of Superintelligence:
“Given the extreme security implications of superintelligence, governments would likely seek to nationalize any project on their territory that they thought close to achieving a takeoff. A powerful state might also attempt to acquire projects located in other countries through espionage, theft, kidnapping, bribery, threats, military conquest, or any other available means.”