This is also why people have that status quo bias—no one wants to starve to death, even with a 'pleasure' button.
It was my understanding that the hypothetical scenario ruled this out (hence the abnormally long lifespan).
In any event, an FAI would want to maximize its utility. If its utility is contingent on the amount of pleasure being experienced, it seems probable that it would want to create as many humans as possible and keep them alive as long as possible in a wirehead simulation.