This is a good question and I’d be interested to hear answers to it as well.
Briefly, I’ll say that there seem to be plenty of reductio-ad-absurdum arguments against a “self” existing at all, such as the implications of philosophical zombies and the like. Rob Bensinger’s post here goes into this matter a bit.
If these arguments are valid, then it seems to me that neither “annihilation” nor “immortality” can actually be true. In short, “patternism” may imply that there is no “you” at all, beyond the conceptual construct your brain has created for itself. But these questions are really important to me to get right, because they call into question whether it is rational, or moral, to try to preserve myself through technologies such as cryonics, when that effort and money could be spent on a different moral good.
If reducing “I” to a pattern means you stop being a moral subject, then surely the same reduction applies to all other people, and they stop being moral subjects too.
True, but further questions remain. Should I try to ensure that exactly one instance of myself exists at any given moment? Instead of trying to preserve my body, should I be content with creating a million copies of my mind in some simulation, indifferent to whether my “original” still exists anywhere? Or should I be content with creating agents that aren’t like me and don’t share my exact history of experiences, but do share my goals? It seems these issues turn on somewhat dualist questions about the mind, and on whether moral value attaches to preserving this “self”.