If you were in a burning building, you would try pretty hard to get out. Therefore, you must strongly dislike death and want to avoid it. But if you strongly dislike death and want to avoid it, you must be lying when you say you accept death as a natural part of life and think it’s crass and selfish to try to cheat the Reaper. And therefore your reluctance to sign up for cryonics violates your own revealed preferences! You must just be trying to signal conformity or something.
I don’t think this section bolsters your point much. The obvious explanation for this behaviour, to me, is the utility functions for each situation.
For the fire: Expected Utility = p(longer life | Leaving fire) * Utility(longer life) - Cost(Running)
For cryonics: Expected Utility = p(longer life | Signing up for cryonics) * Utility(longer life) - Cost(Cryonics)
It’s pretty safe to assume that almost everyone assigns a value of almost one to p(longer life | Leaving fire), and a relatively insignificant value to Cost(Running), which would mainly be temporary exhaustion. But those aren’t necessarily valid assumptions in the case of cryonics. Even the most ardent supporter of cryonics is unlikely to assign a probability as large as that of the fire. And the monetary costs are quite significant, especially for some demographics.
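To make the comparison concrete, here is a minimal sketch of the two expected-utility calculations. All the numbers are made-up assumptions purely for illustration (the probability of revival from cryonics and the value placed on a longer life are exactly the contested quantities), not claims about the actual values:

```python
def expected_utility(p_longer_life, utility_longer_life, cost):
    """EU = p(longer life | action) * Utility(longer life) - Cost(action)."""
    return p_longer_life * utility_longer_life - cost

# Assumed value placed on a longer life (arbitrary units).
utility_longer_life = 1_000_000

# Fleeing a fire: near-certain payoff, trivial cost (temporary exhaustion).
eu_fire = expected_utility(0.99, utility_longer_life, cost=10)

# Cryonics: contested probability of revival, significant monetary cost.
eu_cryonics = expected_utility(0.02, utility_longer_life, cost=100_000)

print(eu_fire)      # 989990.0
print(eu_cryonics)  # -80000.0
```

With these (debatable) inputs, fleeing the fire is overwhelmingly positive while cryonics comes out negative, so the two decisions can diverge without any inconsistency in the underlying preferences.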
That’s a good question. I didn’t really think about it when I read it, because I am personally completely dismissive of and not scared by haunted houses, whereas I am skeptical of cryonics, and couldn’t afford it even if I did the research and decided it was worth it.
I’m not sure it can be, but I’m not sure a true rationalist would be scared by a haunted house. The only rational utility function I can come up with is that of someone who suspended his disbelief because he enjoyed being scared. I feel like this example is far more related to irrationality and innate, irrepressible bias than it is to rationality.