I think a lot of thinking around multipolar scenarios suffers from the heuristic of "a solution in the shape of the problem", i.e., "a multipolar scenario is when we have kinda-aligned AI but still die from coordination failures; therefore, the solution to multipolar scenarios should be about coordination."
I think the correct solution is to leverage available superintelligence in a nice unilateral way:
D/acc: use superintelligence to build up as much defence as you can, starting with formal software verification and ending with spreading biodefence nanotech;
Running away: if you set up a Moon/Mars/Jovian colony of nanotech-upgraded humans/uploads and pour available resources into its defence, then even if Earth explodes, humanity survives as a species.