Thanks for your response, just a few of my thoughts on your points:
If you *can* stop doing philosophy and futurism
To be honest, I’ve never really *wanted* to be involved with this. I only really made an account here *because* of my anxieties, and I wanted to try to talk myself through them.
If an atom-for-atom identical copy of you, *is* you, and an *almost* identical copy is *almost* you, then in a sufficiently large universe where all possible configurations of matter are realized, it makes more sense to think about the relative measure of different configurations rather than what happens to “you”.
Personally, I don’t buy that theory of personal identity. It seems to me that if the biological me sitting here right now isn’t *feeling* the pain, it’s not worth worrying about as much. Like, I can *imagine* that a version of me might be getting tortured horribly or experiencing endless bliss, but my consciousness doesn’t (as far as I can tell) “jump” over to those versions. Similarly, were *I* to get tortured, it’s unlikely I’d care about what’s happening to the “other” versions of me. The “continuity of consciousness” theory *seems* stronger to me, although admittedly it’s not something I’ve put a lot of thought into. I wouldn’t want to use a teleporter for the same reasons.
*And* there are evolutionary reasons for a creature like you to be *more* unable to imagine the scope of the great things.
Yes, I agree that it’s possible the future could be just as good as an infinite-torture future would be bad, and that my intuitions are somewhat lopsided. But I do struggle to find that comforting. Were an infinite-torture future realised (whether through a SignFlip error, an insane neuromorph, or something else), the fact that I could’ve ended up in a utopia wouldn’t console me one bit.