Shouldn’t this outcome be something the CEV would avoid anyway? If it’s making an AI that wants what we would want, then it should not at the same time be making something we would not want to exist.