I’ll make an analogy here to get around the AI-worship-induced gut reactions:
I think most people are fairly convinced there isn’t a moral imperative beyond their own life. That is, even if behaving as though your own life is the ultimate driver of moral value is wrong and ineffective, from a logical standpoint it is: once your conscious experience ends, everything ends.
I’m not saying this is certain. It may be that the line between conscious states is so blurry that continuity between sleep and wakefulness is basically 0, or as low as that between you and other completely different humans (who will be alive even once you die and will keep on flourishing). It may be that there is a ghost in the machine under whatever metaphysical framework you prefer… but, if I had to take a bet, I’d say something like a 15, 40, 60% chance that once you close your eyes it’s over, the universe is done for.
I think many people accept this viewpoint, but most of them don’t spend even a moment thinking about anti-aging, and even those like myself who do aren’t too concerned about death in a “mood” sense. Why would you be? It’s inevitable. Yes, your actions might contribute to averting death by 0.x% if you’re very lucky, so you should pursue that area because… well, nothing better to do, right? But it makes no sense to concern oneself with death in an emotional way, since it’s likely coming anyway.
After all, the purpose of life is living, and if you’re not living because you’re worrying about death, you lost. Even in the case where you were able to defeat death, you still lost: you didn’t live, or, less metaphorically, you lived a life of suffering, or of unmet potential.
Nor does it help to be paralyzed by the fear of death every waking moment of one’s life. It will likely make you less able to destroy the very evil you are opposing.
Such is the case with every potentially horrible inevitability in life. Even if it is “absolute” in its badness, being afraid of it will not make it easier to avoid, and it might ultimately defeat the purpose of avoiding it, which is the happiness of you and the people you care about, since all of those will be more miserable if you are paralyzed by fear.
So even if whatever fake model you had assumed a 99.9% chance of being destroyed by HAL or whatever 10 years from now, the most sensible course of action would still be not to get too emotional about the whole thing.