In a sufficiently bad situation, you may wish for the genie to kill you because you think that’s your only wish. It’s not likely for any given wish, but would happen eventually (and ends the recursion, so that’s one of the few stable wishes).
If I kill myself, there is no nth wish as n → infinity, or a busy beaver function of Graham’s numberth wish, so the first wish is wishing for something undefined.
Also, the probability that any of the individually improbable kill-myself events occurs is bounded above by the sum of their probabilities, and they could be a convergent infinite series, if the probability of wanting to kill myself went down each time (fast enough, e.g. geometrically). Even though I stipulated that it’s if I believed each wish was the last, I might do something like “except don’t grant this wish if it would result in me wanting to kill myself or dying before I could consider the question” in each hypothetical wish. Or grant myself superintelligence as part of one of the hypothetical wishes, and come up with an even better safeguard when I found myself (to my great irrational surprise) getting another wish.
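The union-bound point can be sketched numerically. Purely as an illustration (the 1% starting risk and the halving factor are made-up numbers, not anything established above): if the probability of a fatal wish on attempt n decays geometrically, the total probability of ever making one is bounded by a convergent geometric series that stays far below 1.

```python
# Illustration of the union bound with a geometrically decaying risk.
# Assumed parameters (hypothetical): p = risk on the first wish,
# r = factor by which the risk shrinks on each subsequent wish.
p, r = 0.01, 0.5

# Union bound: P(any fatal wish) <= sum over n of p * r**n.
# Truncate the infinite series at 1000 terms; the tail is negligible.
total = sum(p * r**n for n in range(1000))

# Closed form of the full geometric series, for comparison.
bound = p / (1 - r)

print(total, bound)  # both are about 0.02 -- far below certainty
```

If instead the per-wish probability stayed constant (or fell too slowly, like 1/n), the series would diverge and death would be certain in the limit, which is exactly why the decay condition matters.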
There is not even necessarily a tiny chance of wanting to kill myself. Good epistemology says to think there is, just in case, but some things are actually impossible. Using wishes to make it impossible for me to want to kill myself might come faster than killing myself.
If I kill myself, there is no nth wish as n → infinity, or a busy beaver function of Graham’s numberth wish, so the first wish is wishing for something undefined.
I think you’re right, though I’m not sure that’s exactly a good thing.
and they could be a convergent infinite series, if the probability of wanting to kill myself went down each time.
I see no particular reason to expect that to be the case.
Using wishes to make it impossible for me to want to kill myself might come faster than killing myself.
Excellent point. That might just work (though I’m sure there are still a thousand ways it could go mind-bogglingly wrong).