A government might model the situation as something like “the first country/coalition to open up an AI capabilities gap of size X versus everyone else wins” because it can then easily win a tech/cultural/memetic/military/economic competition against everyone else and take over the world. (Or a fuzzy version of this to take into account various uncertainties.) Seems like a very different kind of utility function.