The question is how popular the war was with the soldiers, who had the weapons. The home front's opinion matters relatively little in the face of enough angry people with guns.
Similarly chaotic conditions could come to exist in many Western countries as well, perhaps as the result of a world war or an economic transformation driven by AI.
Or climate change. True. But I honestly still expect right-wing authoritarianism to emerge victorious from most of those scenarios. The leftist front just doesn't have enough unity, or enough of a project, to turn even its worst impulses into actual policy. The only liberal-leaning regime I can imagine emerging is a moderate technocracy, enforced by an alliance between politicians and techno-capitalists and bolstered in power by AI.
And of course, since we're here: anyone who created an aligned ASI first would have a shot at shaping the world as they see fit, so that's the ultimate pivot upon which even a very extreme minority might impose its views on everyone else. I wrote a whole post about this once. But those are very extreme scenarios (and if such an ASI is possible, I expect it would most likely just kill us).