I have seen this argument and I don’t disagree; however, if the risk were only human rogue actors ordering the AGI around, then people might say that all we need to make it safe is to keep it under lock and key, so that it can only be used with authorisation by thoroughly vetted people, as we do for many dangerous things. The AGI itself being agentic makes it clear how hard it would be to control, because you can’t forbid it from using itself.
About the ability of intelligence to affect the world, I agree on being sceptical of nigh-magical abilities, but there obviously are very low-hanging fruits when it comes to killing most of humanity, especially bioweapons: things for which even human intellect would be enough, given resources that random death cultists don’t have but an AGI likely would.