“The government can and has simply exerted emergency powers in extreme situations. Developing AGI, properly understood, is definitely an extreme situation. If that were somehow ruled an executive overreach, Congress can simply pass new laws.”
-> How likely do you think it is that there’s clear consensus on AGI being an extreme situation, and at what point in the trajectory? I definitely agree that if there were consensus, the USG would take action. But I’m kind of worried things will be messy and unclear and different groups will have different narratives, etc.
I think the question isn’t whether but when. AGI is most obviously a huge national security opportunity and risk. The closer we get to it, the more evidence there will be, and the more we talk about it, the more attention will be devoted to it by the national security apparatus.
The likely path to takeoff is relatively slow and continuous. People will get to talk to fully human-level entities before those entities are smart enough to take over. Those people will recognize the potential of a new intelligent species in a visceral way that abstract arguments don’t provide.