To me, the idea that the AI will choose to keep humans around for all eternity is scarier than the idea that it will not. But that is something Eliezer either disagrees with or has deliberately left obscure.
Wouldn’t it make sense to keep some humans around for all eternity—in the history simul-books? That seems reasonable, and not especially scary.
Sure. Tiling the universe largely with humans is the strong scary idea. Locking in human values for the rest of the universe is the weak scary idea. Unless the first doesn’t imply the second, in which case I don’t know which is scarier.