I’m very uncertain and feel somewhat out of my depth on this. I do take quite a bit of hope, though, from arguments like those in https://aiprospects.substack.com/p/paretotopian-goal-alignment.
Wow, that seems really promising (thank you for the link!). I can see one potential problem with the plan, though. It relies on the assumption that giving away 10% of the resources is the safest strategy for whoever controls AGI. But could it be that the group controlling AGI still lives in an “us vs. them” mindset and decides that giving away 10% of the resources is actually the riskier strategy, because it would hand the opposing side more resources with which to potentially wrest control of AGI away from them?