Every time you spend an amount of money of that order, or some multiple of it, do you think about whether what you’re doing is more important than that many children’s lives? Or is it just with medical procedures that have some chance of saving your life? Or is cryonics in particular being singled out for some reason?
Why encourage people to make a mistake in one domain, just because they’re already making the same mistake in many other domains?
It wasn’t completely a rhetorical question. I actually did want to see if jtolds applied that standard consistently; I wasn’t necessarily saying that that’s a bad idea. (Myself, I remain confused about that question. A typical consequentialism would seem to compel me to weigh any cost against that of saving a life, but I’m not sure whether I have to bite that bullet or whether I’m doing something wrong.)
It’s not a given that “excessive” egoism is a mistake.
Whether or not it’s a given, it’s an assumption behind the particular argument I was responding to.
I disagree with the linked post, but it would take some thinking/writing/comment-reading time to explain why. And surely, if “shut up and divide” is a reason for egoism, it’s also a reason for myopia?
The argument is that there is significant uncertainty, not that we certainly (or even probably) are that selfish. Values can in principle say anything about the preferred shape of spacetime, so selfishness at present doesn’t automatically imply not caring about the future.
If, due to scope insensitivity, we care about living for a billion years only a million times as much as about eating an extra cookie, then applying Wei Dai’s “shut up and divide” principle, we should prefer the extra cookie to 500 years of life. (ETA: while this is extreme myopia, it may not be enough myopia to make cryo a bad idea.)
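To spell out the arithmetic behind that claim (assuming, purely as an illustration, that “shut up and divide” here means spreading the total valuation evenly across the billion years):

$$\frac{U(10^9\ \text{years})}{U(\text{cookie})} = 10^6 \quad\Rightarrow\quad U(1\ \text{year}) = \frac{10^6}{10^9}\,U(\text{cookie}) = 10^{-3}\,U(\text{cookie}),$$

so $U(500\ \text{years}) \approx 0.5\,U(\text{cookie})$, which is less than the value of the one extra cookie.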