Whether or not it’s a given, it’s an assumption behind the particular argument I was responding to.
I disagree with the linked post, but it would take some thinking/writing/comments-reading time to explain why. And surely if “shut up and divide” is a reason for egoism, it’s also a reason for myopia?
The argument is that there is significant uncertainty, not that we certainly (or even probably) are that selfish. Values can in principle say anything about the preferred shape of spacetime, so selfishness at present doesn’t automatically imply not caring about the future.
If, due to scope insensitivity, we care about living for a billion years only a million times as much as about eating an extra cookie, then applying Wei Dai’s “shut up and divide” principle says we should prefer the extra cookie to 500 years of life. (ETA: while this is extreme myopia, it may not be enough myopia to make cryo a bad idea.)
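To spell out the arithmetic (a rough sketch, assuming “shut up and divide” here means spreading that scope-insensitive valuation evenly over the years at stake; the numbers are the illustrative ones from the comment):

```python
# Illustrative numbers from the comment, not a claim about anyone's actual values.
years_at_stake = 1e9          # living for a billion years
cookies_equivalent = 1e6      # scope-insensitive valuation: worth only a million cookies
value_per_year = cookies_equivalent / years_at_stake  # "divide": spread value evenly over years
print(value_per_year * 500)   # 0.001 * 500 = 0.5 cookies, i.e. less than one extra cookie
```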
It’s not a given that “excessive” egoism is a mistake.