I think all AI research makes AGI easier, so “non-AGI AI research” might not be a thing. And even if I’m wrong about that, it also seems to me that most harms of AGI could come from tool AI + humans just as well. So I’m not sure the question is right. Tbh I’d just stop most AI work.
I agree that tool AI + humans can create a lot of large-magnitude harms. I think it's probably still quite a bit less bad than directly having a highly intelligent, faster-than-human, self-duplicating, anti-human AGI on the loose.
The trouble, though, is that with sufficient available compute, sufficient broad scientific knowledge about the brain and learning algorithms, and sufficiently powerful tool AI, it becomes trivially fast and easy for a single well-resourced human to make the unwise, irreversible decision to create and unleash a powerful unaligned AGI.
If anyone on Earth had the option to anonymously purchase a nuclear bomb for $10k at any time, I don’t expect a law against owning or using nuclear weapons would prevent all use of nuclear weapons. Sometimes people do bad things.
AI + humans would just eventually give rise to AGI anyway, so I don't see the distinction people try to make here.