Thanks for your comment. This is something I should have stated a bit more explicitly.
When I mentioned “single state (or part thereof)”, the “part thereof” was referring to these groups, or to groups in other countries that are yet to be formed.
I think the chance of government intervention is quite high in the slow take-off scenario. It’s quite likely that any group successfully working on AGI will slowly but noticeably start to accumulate a lot of resources. If that cannot be concealed, it will start to attract a lot of attention. I think it is unlikely that the government and state bureaucracy would be content to let such resources accumulate untouched; see, for example, the current shifting attitude toward Big Tech in Brussels and Washington.
In a fast take-off scenario, I think we can frame things more provocatively: the group that develops AGI either becomes the government, or the government takes control while it still can. I’m not sure what the relative probabilities are here, but in both cases you end up with something that will act like a state, and be treated as a state by other states, which is why I model such groups as states in my analysis. For example, even if OpenAI and DeepMind are friendly to each other, and that persists over decades, I can easily imagine the Chinese state trying to develop an alternative that might not be friendly to those two groups, especially if the Chinese government perceives them as promoting a different model of government.