I don’t think it is magic, but I still find it sufficiently disgusting to treat it as an equivalent threat now. Red button now.
It’s not a good idea to wait to treat a disease until it is about to kill you: prevention is the way to go.
So no, I don’t think it is magic. But I do think that, just as the world agreed to ban human cloning long before there was a human clone, now is the time to act.
So, gathering up your beliefs: you believe ASI/AGI to be a threat, but not so dangerous a threat that you need to use nuclear weapons until an enemy nation with it is extremely far along, which, according to your beliefs, will take many years since it’s not that good.
But you find the very idea of non-human intelligence in use by humans, or possibly serving itself, so disgusting that you want nuclear weapons used the instant anyone steps out of compliance with international rules you wish to impose. (Note that this is historically unprecedented: arms control treaties have been voluntary and did not carry immediate thermonuclear war as the penalty for violating them.)
And since your beliefs are emotionally based on “disgust”, I assume there is no updating based on actual measurements? That is, if ASI turns out to be safer than you currently think, you still want immediate nukes, and vice versa?
What percentage of decision makers in the world’s superpowers do you feel share your belief? Just a rough guess is fine.
The point is that sanctions should be applied as necessary to discourage AGI; however, approximate grim triggers should apply as needed to prevent dystopia.
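To make "approximate grim trigger" concrete, here is a minimal sketch in Python of how it differs from a strict grim trigger in an iterated game. The function names and the tolerance threshold are illustrative assumptions on my part, not a formal model of the treaty regime under discussion:

```python
# Sketch: strict vs. "approximate" grim trigger in an iterated game.
# Names and thresholds are illustrative assumptions, not a formal model.

def grim_trigger(history: list[str]) -> str:
    """Classic grim trigger: cooperate until the first observed defection,
    then punish on every round afterwards, forever."""
    return "punish" if "defect" in history else "cooperate"

def approximate_grim_trigger(history: list[str], tolerance: int = 2) -> str:
    """Approximate variant: tolerate a small number of observed defections
    (e.g. noisy intelligence, minor violations) before switching
    permanently to punishment."""
    return "punish" if history.count("defect") > tolerance else "cooperate"

if __name__ == "__main__":
    observed = ["cooperate", "cooperate", "defect", "cooperate"]
    print(grim_trigger(observed))              # -> "punish"
    print(approximate_grim_trigger(observed))  # -> "cooperate" (under tolerance)
```

The point of the approximate variant is that graduated responses (sanctions) absorb minor or ambiguous violations, while the irreversible punishment is reserved for crossing a clearly defined threshold.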
As the other commenters have mentioned, my reaction is not unusual, which is why concerns about doom have been widespread.
So the answer is: enough.