First, where do you get those estimates from? I can only speak for myself, but I wouldn’t put figures on estimates when it doesn’t seem like even a consensus of experts would get any of those right, as they verge on questions more or less unsolved. Estimates 3 and 5 in particular, and 1 as well, are going to be totally speculative (subjective).
By that I mean that even if your estimate is as good as you can hope to make it, the real-world event you’re betting on will probably turn out otherwise, in some totally unexpected way. Spread those confidence bounds by orders of magnitude. Let’s admit we don’t know. A narrow, focused probability estimate implies that you know better than you would by chance, and for that to be true you ought to have good, practical reasons to believe you do.
Then, even if we don’t know, and even if the probability of cryonics working is very, very small … I’d expect that since your life is instrumentally necessary for you to experience or do anything, it supersedes in utility most if not all of the other things you value. You may think you love, say, chocolate more than life, but if you’re dead, you can’t pursue or even value that. The same goes for anything else: love, friendship, freedom, etc. Given that, shouldn’t you be ready to protect your prospects of survival as well as you can? The only reason to avoid cryonics, then, is if its cost makes it impossible for you to invest in another strategy whose potential to sustain your life (indefinitely) is greater than cryonics’. If not, and you can take on both, then no matter how low the probability of cryonics working, it would still be rational to invest in it.
(The only other issue I can see with this outlook is that it has some potential to turn you into a money pump: there is always a way to justify bleeding money, no matter the cost, to secure a trillionth after trillionth of a chance of saving your life, so long as you can still afford it, down to your last disposable penny.)
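The worry in that parenthesis can be made concrete with a toy expected-utility calculation. Everything here is invented for illustration: `u_survival` is just a stand-in for the assumption that survival outweighs everything else you value.

```python
# Toy sketch of the money-pump worry: if survival carries an effectively
# unbounded utility, expected-value reasoning endorses paying any
# affordable price for any tiny probability of it.

def expected_value(p_works: float, u_survival: float, cost: float) -> float:
    """Expected utility of buying one more long-shot survival scheme."""
    return p_works * u_survival - cost

u_survival = 1e18   # assumed: survival dwarfs everything else you value
for p in (1e-3, 1e-6, 1e-12):
    ev = expected_value(p, u_survival, cost=100_000)
    print(f"p = {p:g}: EV = {ev:,.0f} -> {'buy' if ev > 0 else 'skip'}")
```

Because the assumed utility of survival swamps any affordable cost, the expected value stays positive at every probability in the loop, which is exactly the money-pump dynamic: there is always one more long shot worth buying.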
I wouldn’t put figures on estimates when it doesn’t seem like even a consensus of experts would get any of those right, as they verge on questions more or less unsolved.
You have to make a decision. Obviously, what the right decision is depends on what those probabilities are, so you have to factor in your beliefs (uncertain though they are) about them. What makes you think you can do that better without numbers than with?
I’d expect that [...] your life [...] supersedes in utility most if not all of the other things you value.
A reasonable expectation if you care only about yourself. Most of us care to some extent about other people. (In most cases, rather less than we like to think and say we do; but still, more than zero.) Any money you spend on cryonics is money that isn’t helping your family in the shorter term.
it has some potential to turn you into a money pump
Yes. There is a parallel objection to Pascal’s wager. (Though I think what it really is, in both cases, is not so much a reason to disagree as a sign that there’s likely something wrong with the logic. It could turn out that being money-pumped is the best one can do, after all.)
Yes, if we are to make a decision, we need the numbers. I wouldn’t say that a decision made without factoring in numbers can be better, all else being equal. It can only be worse or equally good: equally good when the numbers used are no better than numbers drawn from a random number generator. So even though using numbers is fine, putting forward a definite probability estimate as if it were significantly better than a random guess is, I think, a misuse.
When I see one of those, it makes me think of the other; in the absence of a particular reason, a detailed analysis, or a mechanism explaining why, I might be tempted to think both estimates rely on the same rule of thumb.
“A news story about an Australian national lottery that was just starting up interviewed a man on the street, asking him if he would play. He said yes. Then they asked him what he thought his odds were of winning. ‘Fifty-fifty,’ he said, ‘either I win or I don’t.’”
“The probability that human civilization will survive into the sufficiently far future (my estimate: 50%)”
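The earlier claim that uninformative numbers are no better than a random number generator can be sketched with a toy simulation (the betting setup is invented for illustration): a bettor who knows each event’s true probability beats chance, while one who substitutes random numbers for estimates does not.

```python
import random

# Compare a "calibrated" bettor (who knows each event's true probability)
# with one who plugs in random numbers instead. Toy setup, not a model
# of cryonics: events are coin flips with uniformly drawn biases.
random.seed(0)

def simulate(estimator, trials=100_000):
    """Bet 'yes' whenever the estimate exceeds 0.5; return the hit rate."""
    wins = 0
    for _ in range(trials):
        p_true = random.random()            # this event's true probability
        outcome = random.random() < p_true  # did the event happen?
        bet_yes = estimator(p_true) > 0.5
        wins += (bet_yes == outcome)
    return wins / trials

calibrated = simulate(lambda p: p)                # uses the real probability
guessing = simulate(lambda p: random.random())    # ignores it entirely
print(f"calibrated: {calibrated:.3f}, random numbers: {guessing:.3f}")
```

The calibrated hit rate comes out around 0.75, while plugging in random numbers hovers around 0.5: a stated probability only buys you anything if it tracks something real.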
As to the fact that an overwhelming majority of people don’t care only about themselves, I agree. Even avowedly selfish people still ought to have (barring abnormal neurology) some mindware buried in their brains that could cause them to care about others, and possibly even to sacrifice their lives for them.
Let us say you value freedom (not just for yourself but for friends, family, and fellow countrymen). If you have a choice:
Option 1) increases the amount of freedom, but kills you. Option 2) decreases the amount of freedom, but allows you to live.
Which should you choose?
I’d pick 1. Sometimes your own death is necessary to promote your values, even if you won’t be around to enjoy the benefits, e.g. fighting against the Nazis.
Yes, in some cases. But are those a minority or a majority of the cases where you have to put your life in the balance and decide whether it’s worth sacrificing for something, or sacrificing that thing instead?
It all hinges on your values too. Here it seemed to be a given that one’s life was considered valuable, but that under some threshold the probability of survival was too small to deserve a personal sacrifice (money, time, pleasure, energy, etc.). All of this from a personal standpoint, weighing individual benefits against individual costs. If all that is being considered is your own subjective enjoyment of life, then it still seems to me that any personal sacrifice is at most as undesirable as the loss of your life. And this calls into question how much we value our own enjoyment and life compared with other values, such as others’ general well-being. In other words, how selfish and how altruistic are we, in what proportion, and in which cases?