I care about longevity; I donate to longevity research institutions. I also try to live healthily.
That said, I’m also in my early 30s. I just took an actuarial table and my rough probability distribution over when transformative AI becomes possible, and calculated my probability of dying first vs. my probability of living to see transformative AI: roughly 23% and 77%. So, like, even if I’m totally selfish, on my beliefs it seems about three times more important to do something about the Singularity than about all-cause mortality.
This is less true the older someone is, of course.
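For concreteness, here’s a minimal sketch of that kind of calculation in Python, with made-up mortality and TAI-arrival hazards standing in for my actual actuarial table and timeline distribution (so the printed numbers are illustrative, not my 23%/77%):

```python
# Year-by-year race between dying and transformative AI (TAI) arriving.
# Both hazard series below are illustrative placeholders, not real data.

def p_die_before_tai(annual_mortality, annual_tai_hazard):
    """Probability of dying before TAI, stepping forward one year at a time.
    Each year, TAI can arrive (treated as happening first); if it doesn't,
    the person can die with that year's mortality rate."""
    p_alive_no_tai = 1.0   # P(still alive and TAI hasn't arrived yet)
    p_die_first = 0.0
    for q_death, h_tai in zip(annual_mortality, annual_tai_hazard):
        p_alive_no_tai *= (1 - h_tai)        # TAI didn't arrive this year
        p_die_first += p_alive_no_tai * q_death
        p_alive_no_tai *= (1 - q_death)      # survived the year
    return p_die_first

# Someone in their early 30s over a 70-year horizon: Gompertz-ish mortality
# (doubling roughly every 8 years) and more TAI probability in the next few decades.
years = 70
annual_mortality = [0.001 * 1.09 ** t for t in range(years)]
annual_tai_hazard = [0.03 if t < 30 else 0.01 for t in range(years)]

p_die = p_die_before_tai(annual_mortality, annual_tai_hazard)
print(f"P(die before TAI):         {p_die:.0%}")
print(f"P(see TAI or outlive run): {1 - p_die:.0%}")
```

Starting the mortality series at a higher age shifts the split toward dying first, which is the “less true the older someone is” point.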
Maybe I am misreading this, but when they say “using the mortality rates for 2019”, I think they are assuming there won’t be any further increases in life expectancy. Like, we’re currently observing that people born in the 1930s are living ~80 years, and so we assume that people born in, e.g., the 1980s will also live ~80 years. But that seems like a very bad assumption to me.
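To make the concern concrete, here’s a toy comparison of remaining life expectancy at 30 when today’s rates are frozen forever (a period-table assumption) versus when mortality keeps improving ~1%/year. The baseline hazard curve is made up, not the actual 2019 table:

```python
# Toy period-vs-cohort comparison. The baseline hazard is a made-up
# Gompertz-ish curve, not real 2019 mortality data.

def remaining_life_expectancy(age, annual_improvement=0.0, max_age=120):
    """Expected further years of life, either freezing today's rates
    (annual_improvement=0, i.e. a period table) or letting them fall
    each calendar year (a crude cohort-style projection)."""
    baseline = lambda a: min(1.0, 0.0005 * 1.09 ** (a - 30))  # hazard at age a today
    alive, expected_years = 1.0, 0.0
    for t in range(max_age - age):
        q = baseline(age + t) * (1 - annual_improvement) ** t
        expected_years += alive * (1 - q / 2)  # ~half a year credited in the year of death
        alive *= (1 - q)
    return expected_years

print(f"frozen 2019-style rates: {remaining_life_expectancy(30):.1f} more years")
print(f"with ~1%/yr improvement: {remaining_life_expectancy(30, 0.01):.1f} more years")
```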