I disagree that rapid self-improvement and goal stability are load-bearing arguments here. Even goals are not strictly, 100% required. If we build something with the means to kill everyone, then we should be worried about it. If it has goals that cannot be directed or predicted, then we should be VERY worried about it.
What are the steps? Are we deliberately building a superintelligence with the goal of killing us all? If not, where do the motivation and ability come from?
For me, ability = capability = means. This is one of the two arguments that I said were load-bearing. Where will it come from? Well, we are specifically trying to build the most capable systems possible.
Motivation (i.e., goals) is not actually strictly required. However, there are reasons to think that an AGI could have goals that are not aligned with most humans'. The most fundamental is instrumental convergence.
Note that my original comment was not making this case. It was just a meta discussion about what it would take to refute Eliezer’s argument.