Sometimes I still marvel at how, in most time-travel stories, nobody thinks of this.
Perhaps something is simply wrong with my moral processing module, but I really don’t see any morality issues associated with this sort of thing. Morality is, in my opinion, defined by society and environment; sure, killing people in the here and now in which we live is (in general) morally wrong, but if you stray too far from that here and now, morality as we know it breaks down.
One area where I feel this applies is the “countless universes” arena. In most of these cases, we’re bandying about entire universes in such a fashion that you could effectively stamp each one with an (admittedly very large) numerical index that describes it completely. At that point, you’re so far outside the context our morality is geared to that I don’t feel it makes sense to apply it.
Suppose an entire universe full of people (described by a huge number we’ll denote X) is destroyed, and another one also full of people (described by another huge number we’ll denote Y) is created in its place: exactly what moral context am I supposed to operate in here? We’ve basically defined people as chunks of a bitstream which can be trivially created, destroyed, and reproduced without loss. This is completely outside the scope of the standard basis for our morality: that people and intelligent thought are somehow special and sacred.
Intelligence might be special and sacred when operating inside a universe where it is uncommon and/or ultra-powerful. However, when talking about universes as mere objects to be tossed around, to be created and destroyed and copied and modified like data files, moral qualms about their contents suddenly seem a lot less valid.
Note: Even just reducing the “specialness” of intelligence has similar effects for me. Consider a situation where you could create identical copies of yourself for zero cost, but each copy had maintenance and upkeep costs (food, housing, etc.), and you did not know which one was the “original”.
In this environment, if I had sufficient confidence in the copying process, I would have no moral qualms whatsoever about creating a copy of myself to take out the trash, and then self-destructing that copy. The mechanism to determine the “trash copy”? A coin flip. If I lose, I take out the trash and shut myself off; otherwise, I keep doing the important stuff. I don’t even flinch at the idea of myself “taking out the trash and then self-destructing”. It seems like a perfectly normal, day-to-day thing, as long as the tasks I planned out continue to get done.