Responding to your other comment: Probably AI labs will be nationalized, yes, as models reach capability levels that make them weapons in themselves.
The one here: a “toothy international agreement” sounds to me indistinguishable from the “world government and nukes are banned” world that has made rational sense since the 1950s but has not materialized in 73 years.
Why would it happen this time? In your world model, are you imagining that the “world government” outcome was always a non-negligible probability, and the dice roll could go that way?
Or do you think that a world which let countries threaten humanity, and every living person in every city, with doomsday nuclear arsenals would treat total human extinction as a more serious threat than nukes, and so come together to an agreement?
Or do you think the underlying technology or sociological structure of the world has changed in a way that makes a world government possible now when it wasn’t then?
I genuinely don’t know how you are reaching these conclusions. Do you see my perspective? Countless forces between human groups create trends, and those trends are the history and economics we know. Expecting a different result requires the underlying rules to have shifted.
A world government seems much more plausible to me in a world where the only surviving fraction of humanity is huddled in terror in the few remaining underground bunkers belonging to a single nation.
Note: I don’t advocate for this world outcome, but I do see it as a likely outcome in the worlds where strong international cooperation fails.