Anyway, it feels completely ridiculous to talk about it in the first place. There will never be a mind that can quickly and vastly improve itself and then invent all kinds of technological magic to wipe us out. Even most science fiction books avoid that because it sounds too implausible.
Do you acknowledge that:

1. We will some day make an AI that is at least as smart as humans?
2. Humans do try to improve their intelligence (rationality/memory training being a weak example, cyborg research being a better example, and I'm pretty sure we will soon design physical augmentations to improve our intelligence)?

If you acknowledge 1 and 2, then that implies there can (and probably will) be an AI that tries to improve itself.
I think you missed the “quickly and vastly” part, as well as “and then invent all kinds of technological magic to wipe us out”. Note that I still think XiXiDu is wrong to be as confident as he is (assuming “there will never” implies >90% certainty), but if you are going to engage with him, you should engage with his actual arguments.