After thinking about this for a while, I think this is one of the better plans out there. But I still think it has one huge, glaring issue.
Imagine that a group of people which controls AGI wants to keep controlling it. Given that it prevents any other AGIs from appearing, it controls ~100% of Earth’s resources.
From the group’s perspective, would it really be safer to give away 10% of its resources (resources that could later be used to disrupt its control over AGI)? We basically have two scenarios here:
You control 100% of resources, but other people are mad at you.
You control 90% of resources; you don’t know whether other people are still mad at you, but they now control 10% of resources.
Do you think scenario 2 is safer for a group that controls AGI? Because the aforementioned Paretotopian plan hinges on scenario 2 being safer than scenario 1.
Wow, that seems really promising (thank you for the link!). I can see one potential problem with the plan, though. It relies on the assumption that giving away 10% of the resources is the safest strategy for whoever controls AGI. But could it be that the group controlling AGI still lives in an “us vs. them” mindset and decides that giving away 10% of the resources is actually the riskier strategy, because it would give the opposing side more resources with which to potentially take away control over AGI?
I’m very uncertain and feel somewhat out of depth on this. I do have quite some hope though from arguments like those in https://aiprospects.substack.com/p/paretotopian-goal-alignment.