I definitely don’t find centralization inevitable. I have argued that the US government will very likely take control of AGI projects before they’re transformative. But I don’t think they’ll centralize them. Soft Nationalization: How the US Government Will Control AI Labs lists many legal ways the government could exert pressure and control on AGI labs. I think that still severely underestimates the potential for government control without nationalization. The government can simply exert emergency powers in extreme situations, and has done so before. Developing AGI, properly understood, is definitely an extreme situation. If that were somehow ruled an executive overreach, Congress can simply pass new laws. And prior to or failing all of that, the government can, probably will, and might already have some federal agent just show up and say “we just need to understand what’s happening and how important decisions are being made; nothing formal, we don’t want to have to take over and cause you trouble and a slowdown, so just keep us in the loop and there won’t be a problem”.
Taking control of matters of extreme national importance is the government’s job. It will do that as soon as it gets its collective head around how immense a deal AGI will be.
However, I don’t think they’ll centralize AGI, for two reasons. First, John Wentworth is very likely correct that it would slow down progress, probably a lot; bureaucracy does that. Second, the incoming administration believes this, whether or not it’s true.
A “Manhattan Project” would probably mean soft government involvement, just throwing more money into the race dynamics. That’s what would get us to AGI fastest.
However, see AE Studio’s arguments and evidence that conservative lawmakers are actually pretty receptive to x-risk arguments. But I have a hard time imagining Trump either being that cautiously inclined, even if he does believe in the risks to some degree, or keeping his meaty little fingers off of what is starting to look like maybe the most important project of our time.
So unfortunately I think the answer is pretty clearly no, not during a Trump presidency.
“The government can simply exert emergency powers in extreme situations, and has done so before. Developing AGI, properly understood, is definitely an extreme situation. If that were somehow ruled an executive overreach, Congress can simply pass new laws.”
-> How likely do you think it is that there’s clear consensus on AGI being an extreme situation, and at what point in the trajectory? I definitely agree that if there were consensus the USG would take action. But I’m kind of worried things will be messy and unclear and different groups will have different narratives, etc.