If a human child grew up in a less painful world—if they had never lived in a world of AIDS or cancer or slavery, and so did not know these things as evils that had been triumphantly eliminated—and so did not feel that they were “already done” or that the world was “already changed enough”… Would they take the next step, and try to eliminate the unbearable pain of broken hearts, when someone’s lover stops loving them?
Here is a more instructive thought experiment. Suppose a human child grew up in a painless world, and so did not feel that the elimination of pain was “already done” or that the world was “already changed enough.” Should she try to create, in that possible world, the kind of pain that Eliezer doesn’t think we should destroy in the actual world?
Utilitarian would rightly attack this, since the probabilities almost certainly won’t wind up exactly balancing.
Utilitarian’s reply seems to assume that probability assignments are always precise. We may plausibly suppose, however, that belief states are sometimes vague. Granted this supposition, we cannot infer that one probability is higher than the other from the fact that the probabilities do not wind up exactly balancing. If my credence in a proposition is best represented by a spread of values rather than a sharp number, then it may be neither more probable, nor less probable, nor exactly as probable as its negation.