Cryonics seems inherently, and destructively to the human race, grossly selfish. Not only is cryonics a huge cost that could be spent elsewhere helping others
It is not very expensive (and could get even cheaper — see Cryonics Wants To Be Big), and many of its supporters see it in a primarily humanitarian sense (advocating that it be easily and cheaply available to everyone, not just being concerned with having it themselves).
Also, as for “spent elsewhere helping others”: there are charities that can reliably save a life for between $200 and $1000. Every time you spend some amount of money on that order or some multiple thereof, do you think about whether what you’re doing is more important than that many children’s lives? Or is it just with medical procedures that have some chance of saving your life? Or is cryonics in particular being singled out for some reason?
nature and evolution thrive on the necessity of refreshing the population of each species. Though it’s speculation, I would assign the probability of evolution continuing to work (and improve) on the human race as pretty high—what gain does the human species have in preserving humans from the 21st century indefinitely, when 23rd century or later humans are better?
If we survive this century without going extinct or experiencing a civilization-level catastrophe, then we will probably overcome evolution (which deals with reproductive fitness in a given environment and has nothing to do with making us “better” as far as our actual values are concerned). We probably will become much better if we make it to the 23rd century, but by technological means that can be applied just as well to revived 21st-century humans.
Overall, in no way can I think of cryonics benefiting anyone other than the individual’s (I think simply genetic) desire to avoid death.
Should we not avoid death? Do you avoid death?
Every time you spend some amount of money on that order or some multiple thereof, do you think about whether what you’re doing is more important than that many children’s lives? Or is it just with medical procedures that have some chance of saving your life? Or is cryonics in particular being singled out for some reason?
Why encourage people to make a mistake in one domain, just because they’re already making the same mistake in many other domains?
It wasn’t completely a rhetorical question. I actually did want to see if jtolds applied that standard consistently; I wasn’t necessarily saying that that’s a bad idea. (Myself, I remain confused about that question. A typical consequentialism would seem to compel me to weigh any cost against that of saving a life, but I’m not sure whether I have to bite that bullet or if I’m doing something wrong.)
It’s not a given that “excessive” egoism is a mistake.
Whether or not it’s a given, it’s an assumption behind the particular argument I was responding to.
I disagree with the linked post but it would take some thinking/writing/comments-reading time to explain why. And surely if “shut up and divide” is a reason for egoism it’s also a reason for myopia?
The argument is that there is significant uncertainty, not that we certainly (or even probably) are that selfish. Values can in principle say anything about the preferred shape of spacetime, so selfishness at present doesn’t automatically imply not caring about the future.
If, due to scope insensitivity, we care about living for a billion years only a million times as much as about eating an extra cookie, then Wei Dai’s “shut up and divide” principle says we should prefer the extra cookie to 500 years of life. (ETA: while this is extreme myopia, it may not be enough myopia to make cryo a bad idea.)
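For concreteness, here is a rough sketch of the arithmetic behind that comparison, under the assumption (my reading, not stated explicitly above) that the “divided” per-year value is then applied linearly:

\[
\frac{10^{6}\ \text{cookies}}{10^{9}\ \text{years}} = 10^{-3}\ \text{cookies per year},
\qquad
500\ \text{years} \times 10^{-3}\ \tfrac{\text{cookies}}{\text{year}} = 0.5\ \text{cookies} < 1\ \text{cookie},
\]

so the divided-down valuation ranks the extra cookie above the 500 years.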