at least some researchers don’t seem to consider that part of “alignment”.
It’s part of alignment. But it seems mostly separate from the other part, namely “how do you even have consequentialism powerful enough to make, say, nanotech, without killing everyone as a side-effect?” — the two problems don’t appear closely related.