If someone figures out how to stop or reverse aging, presumably that will mean higher quality of life for old people, so more and more of the bad stuff will no longer apply, at least insofar as the bad stuff is physical rather than mental. As for people “somewhere in the middle of the spectrum”—if we suppose roughly constant quality of life thanks to anti-aging treatments, always wanting to live a few more decades means never wanting to stop living, just as much as always wanting to live a few more millennia does.
If I came to expect a thousand years of healthy life, I think I would be more inclined to find long-term purposes. (Maybe a succession of projects each taking a few decades to a few centuries.) Wouldn’t you? I mean, my motivation to try to cure aging, or prove the Riemann Hypothesis, or put an end to poverty or malaria, is less than it could be because I don’t expect to be able to make a very large contribution to any of those things. (I am not claiming that this is rational, i.e. that an ideal agent would have that motivational structure; only that I do, and I don’t think I’m alone.) If I thought I could make 50x as big a difference, I think I’d be willing to work harder. But I may be wrong; introspection is unreliable.