Yeah, Vitalik’s “On Collusion” definitely seems relevant (I was going to mention that myself before I saw you add it). I also had the thought that this ties into Paul’s “strategy-stealing assumption”, which is basically an “end of history” assumption, i.e., that the allocation of power will become entrenched.
My takeaway from all this so far is that “history” consists of powerless people gaining power by better coordinating amongst themselves, which has often involved ideology (non-epistemic beliefs). My guess is that with the advent of AGI, “history” might look very different, with “better coordinating” looking more like technological advances (e.g., better approximations to utility maximizers, who can merge) rather than politics and ideology.
At least one difference is that today the powerless still control their own bodies and labor, while the powerful do not actually have much physical power of their own and instead have to depend on social structures to enforce their power and achieve their goals. So with enough coordination, the powerless can simply ignore or overthrow the existing power structures. With AI, though (even intent-aligned AI), humans who are “powerless” today could become literally powerless.
“So we need to muster enough mass popular support that politicians see which way the wind is blowing and switch sides en masse (like they did with gay marriage).”
Sorry, the illusion of transparency strikes again here. What I meant by “powerful” in that sentence was “humans”, not “politicians”. I’m interested in having a chat about this topic where we can perhaps talk more efficiently; please PM or email me if you’re also interested.
“Whereas if you don’t have new ideologies rising and gaining power, then you can go around fixing individual problems all day, but the core allocation of power in society will become so entrenched that the policy distortions are disastrous.”
ETA: I would be interested in understanding your perspective here better. Why do you think entrenchment of the allocation of power will lead to disaster?