You seem to have two objections to cryonics:
1. Cryonics won’t work.
2. Life extension is bad.
#1 is better addressed by the giant amount of information already written on the subject.
For #2 I’d like to quote a bit of Down and Out in the Magic Kingdom:
Everyone who had serious philosophical conundra on that subject just, you know, died, a generation before. The Bitchun Society didn’t need to convert its detractors, just outlive them.
Even if you don’t think life extension technologies are a good thing, it’s only a matter of time before almost everyone thinks they are. Whatever part of “humanity” you value more than life will be gone forever.
ETA: Actually, there is an out: if you build FAI or some sort of world government and it enforces 20th-century life spans on people. I can’t say “natural” life spans, because our lives were much shorter before modern sanitation and medicine.
Doesn’t this argument imply that we should self-modify to become monomaniacal fitness-maximizers, devoting every quantum of effort towards the goal of tiling the universe with copies of ourselves? Hey, if you don’t, someone else will! Natural selection marches on; it’s only a matter of time.
I find the likelihood of someone eventually doing this successfully to be very scary. And, more generally, the likelihood of natural selection continuing post-AGI, leading to more Hansonian/Malthusian futures.
For #2, there’s also Nick Bostrom’s Fable of the Dragon-Tyrant.