Transhumanists need to stop setting arbitrary dates within current life expectancies for when we allegedly “become immortal.” These forecasts make no logical sense, and you wind up sounding like asses for publishing them.
For example, James D. Miller in Singularity Rising writes:
But now if you die before 2045, you might miss out on millions of years of life. The high cost of death has made survival to 2045 your top priority.
Uh, guys, plenty of people alive in 2014 will probably live another 31 years anyway through natural maturation and aging; they won’t mysteriously become capable of living for “millions of years” by surviving to January 1, 2045.
If you want to set a date that shows some ambition, and at least makes more sense than implying that living another 31 years = “living forever,” pick one in the 23rd century, say, 2245. If you can survive to 2245 in good shape, you might have successfully overcome the major hurdles to radical life extension. (“Past performance doesn’t guarantee future results.”)
James D. Miller responds:

This is way out of context. This passage comes from a subsection of my book that assumes someone gives you a magical scroll containing numerous predictions that come true, including a prediction that a singularity will occur in 2045. It was obviously a thought experiment about how you would behave if you somehow knew there would be a singularity in 2045, not an assertion that the singularity will happen in 2045. Indeed, I used the scroll device precisely so the reader wouldn’t think I was predicting a 2045 singularity.