Are you assuming that avoiding doom in this way will require a pivotal act? It seems that, absent policy intervention and societal change, even if some firms exhibit a proper amount of concern, many others will not.
It’s unclear whether some people being cautious and some people being incautious leads to an AI takeover.
In this hypothetical, I’m including AI developers selling AI systems to law enforcement and militaries, which are used to enforce the law and win wars against competitors using AI. But I’m assuming we wouldn’t pass a bunch of new anti-AI laws (and that AI developers don’t become paramilitaries).