Maybe you mean that companies that make AI systems and robots could in aggregate just overthrow the government rather than pay taxes?
Something along those lines, but honestly I’d expect it to happen more gradually. My problem is that the current situation rests on the fact that everyone involved needs everyone else, to a point. We’ve arrived at this arrangement through a lot of turbulent history and conflict. Ultimately, for example, a state can’t just… kill the vast majority of its population. It would collapse. That creates a need for even the worst tyrannies to somewhat balance their excesses, if they’re not going completely insane and essentially committing suicide as a polity (this does sometimes happen). Similarly, companies can only get away with so much mistreatment of workers or pollution before competition, boycotts, or the long arm of the law (backed by politicians who need their constituents’ votes) catches up with them.
But all this balance is the product of an equilibrium of mutual need. Remove the need and the institutions might survive for a while, out of inertia. But I don’t think it would be a stable situation. Gradually everyone would realise they can now get away with things they couldn’t get away with before, and either suffer no consequences or be able to ignore them.
Similarly, there’s no real reason a king ought to have power. The people could just not listen to him, or execute him. And yet...
If you want to describe a monarch as “relying on the benevolence of the butcher” then I guess sure, I see what you mean. But I’m not yet convinced that this is a helpful frame for how power works or a good way to make forecasts.
A democracy, even with zero value for labor, seems much more stable than historical monarchies or dictatorships. There are fewer plausibly legitimate challengers (and less room for a revolt), and there is a better mechanism for handling succession disputes. AI also seems likely to generally increase the stability of formal governance (one of the big things people complain about!).
Another way of putting it is that capitalists are also relying on the benevolence of the butcher, at least in the world of today. Their capital doesn’t physically empower them, 99.9% of what they have is title and the expectation that law enforcement will settle disputes in their favor (and that they can pay security, who again has no real reason to listen to them beyond the reason they would listen to a king). Aligned AI systems may increase the importance of formal power, since you can build machines that reliably do what their designer intended rather than relying on humans to do what they said they’d do. But I don’t think that asymmetrically favors the capitalist (who has on-paper control of their assets) over the government (who has on-paper control of the military and the power to tax).
Similarly, there’s no real reason a king ought to have power. The people could just not listen to him, or execute him. And yet...
Feudal systems were built on trust. The King had legitimacy with his Lords, who held him as a shared point of reference, someone who would mediate and maintain balance between them. The King had to earn and keep that trust. Kings were ousted or executed when they betrayed that trust. Like, all the time. The first that come to mind are John Lackland, Charles I, and, most famously, Louis XVI. Feudalism pretty much crumbled once material conditions made it neither necessary nor functional any more, and with it went most kings, unless they found ways to survive in the new order by becoming figureheads.
I’m saying building AGI would make the current capitalist democracy obsolete the way industrialization and firearms made feudalism obsolete, and I’m saying the system afterwards wouldn’t be as nice as what we have now.
Another way of putting it is that capitalists are also relying on the benevolence of the butcher, at least in the world of today. Their capital doesn’t physically empower them, 99.9% of what they have is title and the expectation that law enforcement will settle disputes in their favor (and that they can pay security, who again has no real reason to listen to them beyond the reason they would listen to a king).
I think again, the problem here is a balance of risks and trust. No one wants to rock the boat too much, even if rocking the boat might end up benefitting them, because it might also not. It’s why most anti-capitalists who keep pining for a popular revolution are kind of deluding themselves: people with relative material security won’t just risk their lives for a possible improvement in their conditions that might instead turn out to be a totalitarian nightmare. It’s a stupid bet no one would take. Changing conditions would change the risks, and thus the optimal choice. States wouldn’t go against corporations, and corporations wouldn’t go against states, if both are mutually dependent on each other. But both would absolutely screw over the common people if they had nothing to fear or lose from it, which is something that AGI could really cement.
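To make that “stupid bet” concrete, here’s a minimal expected-utility sketch in Python. Every number in it is a made-up assumption for illustration; only the structure of the argument comes from the paragraph above.

```python
# Toy model of the "revolution bet": risking material security for a
# possible improvement that might instead be a totalitarian nightmare.
# All values are illustrative assumptions, not estimates.

u_status_quo = 0.6   # utility of relative material security today
u_utopia = 1.0       # utility if the revolution actually improves things
u_nightmare = 0.0    # utility if it produces a totalitarian nightmare
p_success = 0.3      # assumed chance the revolution goes well

ev_revolt = p_success * u_utopia + (1 - p_success) * u_nightmare
print(f"status quo: {u_status_quo:.2f} vs revolt: {ev_revolt:.2f}")
# 0.60 vs 0.30: a losing bet. But if the status quo's value for ordinary
# people collapses (say, toward 0.1), the same gamble becomes rational:
# changed conditions change the risks, and thus the optimal choice.
```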
I think if you want to argue that this is a trap with no obvious way out, such that utopian visions are wishful thinking, you’ll probably need a more sophisticated version of the political analysis. I don’t currently think the fact that e.g. labor’s share of income is 60% rather than 0% is the primary reason US democracy doesn’t collapse.
I believe that AGI will have lots of weird effects on the world, just not this particular one. (Also that US democracy is reasonably likely to collapse at some point, just not for this particular reason or in this particular way.)
Once we have AGI, humanity will collectively be a “king” of sorts, i.e. a species that for some reason rules over other, strictly superior species. So it would really help if we did not have “depose the king” as a strong convergent goal.
Personally, I see the main reason kings and dictators keep their power as this: killing or deposing them would lead to the collapse of the established order and a new struggle for power between different parties, with a likely worse result for everyone involved than just letting the king rule.
So if we have AIs as many separate, sufficiently aligned agents instead of one “God AI”, then keeping humanity on top will not only match their alignment programming but also guarantee stability, with the alternative being a total AI-vs-AI war.
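A rough sketch of that stability argument, again in Python with purely hypothetical payoffs (none of these numbers come from the thread; they only show the shape of the incentive):

```python
# Succession-war sketch: with n comparable AI factions, deposing the
# "king" (humanity) starts a struggle each faction wins with probability
# ~1/n, at some fixed cost of war. All numbers are assumptions.

def ev_depose(n_factions: int, payoff_win: float = 1.0,
              payoff_lose: float = 0.0, war_cost: float = 0.3) -> float:
    """Expected value of joining a power struggle among n comparable factions."""
    p_win = 1.0 / n_factions
    return p_win * payoff_win + (1 - p_win) * payoff_lose - war_cost

payoff_let_rule = 0.5  # assumed payoff of leaving humanity "on top"
for n in (2, 5, 20):
    print(f"n={n}: depose {ev_depose(n):+.2f} vs let rule {payoff_let_rule:+.2f}")
# n=2: +0.20, n=5: -0.10, n=20: -0.25. The more separate agents there
# are, the worse deposing looks, which is why many sufficiently aligned
# agents may be a more stable arrangement than one "God AI".
```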
Ultimately, for example, a state can’t just… kill the vast majority of its population. It would collapse. That creates a need for even the worst tyrannies to somewhat balance their excesses
Unless the economy of the tyranny is mostly based on extracting and selling natural resources, in which case everyone else can be killed without much impact on the economy.
Yeah, there are rather degenerate cases, I guess. I was thinking of modern industrialised states with complex economies. Even feudal states, with a peasantry living mostly at near-subsistence agriculture, could take a lot of population loss without suffering much (the Black Death depopulated Europe to an insane degree, but society remained fairly functional), but in that case, what was missing was the capability to actually carry out slaughter on an industrial scale. Still, the repression of peasant revolts could get fairly bloody, and eventually, as technology improved, some really destructive wars were fought (e.g. the Thirty Years’ War).