Why is longevity not the number 1 goal for most humans?
Any goal you’d have would be achieved better with sufficient longevity.
Naturally, eternal life is the first goal of my life.
But to achieve this, a global cooperative effort would be required to push the science forward.
Therefore, nowadays I’m mostly thinking about why longevity doesn’t seem to be a concern for most people.
In my worldview, longevity should be up there with ESG in the decision-making process.
But in reality, no one really talks about it.
In conclusion, I have two questions:
Is putting longevity over any other goal a rational decision?
And if so, why isn’t the general population on board with it?
I think there are LOTS of goals one could have that don’t require a guarantee of extended life. Many moments of joy, positive impact on other people (current and future), sailing the Caribbean, etc. In fact, I don’t support any goals that require you specifically to live for a long long time, as opposed to being part of a cooperative structure which lasts beyond any of its current members.
I personally have a preference to live longer—I have a personal stake in my own experiences, which does not apply to other people. That is a form of goal, but I see small enough chance of success that I don’t prioritize it very highly—there’s not much I’d give up over the next 30 years for a very tiny increase in the chance I’ll live to 1000.
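The trade-off in this comment can be framed in rough expected-value terms. As a back-of-the-envelope sketch (every number here is an illustrative assumption, not a figure from the comment):

```python
# Illustrative expected-value sketch of the longevity trade-off.
# All numbers below are made-up assumptions for demonstration only.

baseline_years = 30       # assumed remaining lifespan without intervention
longevity_years = 1000    # assumed lifespan if radical longevity succeeds
p_increase = 0.0001       # assumed "very tiny increase" in success probability
sacrifice_years = 5       # assumed quality-adjusted years given up now

# Expected extra years bought by the tiny probability bump.
expected_gain = p_increase * (longevity_years - baseline_years)

print(f"expected years gained: {expected_gain:.3f}")  # 0.097
print(f"years sacrificed now:  {sacrifice_years}")
print("worth it:", expected_gain > sacrifice_years)   # False
```

Under these assumed numbers, the expected gain (about 0.1 years) is far smaller than the sacrifice, which matches the comment’s intuition; with different probability estimates the conclusion could flip.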
If AGI kills us all, longevity in the sense of biological longevity doesn’t give you much.
In EA spheres there’s an idea that it’s easier to save lives through medical interventions in the third world than through longevity research.
As far as general society is concerned, there’s what Aubrey de Grey calls the pro-death trance, and you can find plenty of discussion from him and others about why it exists.
That is false for a lot of goals, including goals that have a deadline.
Really good point. Though I would argue that most life goals with deadlines have those deadlines only because of mortality itself. I’m trying to think of an example of a life goal that would still have a deadline even if immortality were achieved, but it seems hard to find one.
Two versions of a goal:
World Peace
Preventing a war you think is going to happen
The 2nd may have a (close) deadline; the 1st might have a distant deadline, like when the sun burns out, or something closer, like before you die, or before ‘an AGI revolution (like the industrial revolution) starts’ (assuming you think AGI will happen before the sun burns out).
Surely, you’ve heard the adage that humans can adapt to anything? They have probably adapted to death, and that psychological adaptation has probably been with humans since they became smart enough to understand that death is a thing. I would expect it to be really hard to change or remove (in fact, Terror Management Theory goes even further and argues that much of our psychology is built on the denial of, or dealing with, death).