I’m not entirely self-altruistic; I’ve currently got a pretty strong “don’t create multiple redundant copies of sentient beings” utility component, or shall we say gut instinct.
Is this something you’re saying about yourself personally, or about people in general? Because if it isn’t true for everyone, you still have to deal with the problem mentioned here.
Alright, let’s imagine that I were creating copies of myself for whatever reason:
In the present, I feel equal self-altruism towards all future identical copies of myself.
However, the moment a copy of myself is made, each copy will treat the other as a separate individual (with the regular, old-fashioned altruism one might have towards someone exactly like oneself, rather than future-self-altruism).