Michael Vassar, I read that and laughed and said, “Oh, great, now I’ve got to play the thought experiment again in this new version.”
Albeit I would postulate that on every occasion, the FAI underwent the water-flowing-downhill automatic shutdown engineered into it, with the stop code “desirability differentials vanished”.
The responses that occurred to me—and yes, I had to think about it for a while—would be as follows:
*) Peek at the code. Figure out what happened. Go on from there.
Assuming we don’t allow that (and it’s not in the spirit of the thought experiment), then:
*) Try running the FAI at simpler extrapolations until it preserves desirability; stop worrying about anything that appeared only in the desirability-killing extrapolations. So if being “more the people we wished we were” was the desirability-killer, then I would stop worrying about that, and update my morality accordingly.
*) Transform myself to something with a coherent morality.
*) Proceed as before, but with a shorter-term focus on when my life’s goals are to be achieved, thinking less about the far future—as if you told me that, no matter what, I had to die before a thousand years were up.