I’m also not sure we need a new term. But spelling out exactly what you mean in every statement gets cumbersome. I hate jargon, but there’s a reason for new terms for new concepts.
The issue I care about isn’t what AGI can do now; it’s what it can and will do in the future. If it keeps helping people design things, with no agency (goals) of its own, that’s great. It could still go wrong, but that’s a subtler argument. My point is that we need a term to distinguish AI that just gives answers, like “how could this city be designed better?”, from AI with goals of its own, like “design a better city”. The latter is the kind we’re really worried about, because designing the very best city implies acquiring the most compute to do it, and acquiring the most compute might also imply keeping humans from interfering with your plans.
If we could ensure that AGI never has goals of its own, I think most of the confusion and fear would, and should, die down. As it is, we’re mixing important concerns about agentic AGI with vaguer and less terrifying concerns about non-agentic, tool or “oracle” AGI.