This proposal seems like it takes advantage of something like the conjunction fallacy. The conjunction fallacy is assigning a higher probability to a specific conjunction of conditions than to one of its more general conditions alone. I think that happens because people can more easily construct a story from specific facts. Here, I think the fallacy is placing greater value on one specific future benefit than on the entire class of future benefits that includes it. “I want to be alive in 2114 so that I can read that great Atwood book” is logically worth no more than “I want to be alive in 2114 so that I can experience all of the valuable things I choose to experience then.” But it is a whole lot easier to anticipate and set a value on reading that one book. Or, put another way, it is harder to get lost before reaching a conclusion when looking at the specific.
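The dominance claim here is just monotonicity: the specific future (“the Atwood book exists and I read it”) is a subset of the general future (“anything valuable happens to me, including possibly that book”), so it can never be more probable, nor more valuable when values are nonnegative. A toy Monte Carlo sketch of that inequality (the probabilities are made up purely for illustration):

```python
import random

random.seed(0)

trials = 100_000
p_book = 0.3    # hypothetical chance the specific benefit (the 2114 Atwood novel) happens
p_other = 0.5   # hypothetical chance some other future benefit happens

specific = 0    # futures containing the specific benefit
general = 0     # futures containing the specific benefit OR any other benefit

for _ in range(trials):
    book = random.random() < p_book
    other = random.random() < p_other
    if book:
        specific += 1
    if book or other:
        general += 1

# The general class of benefits always contains the specific case,
# so its frequency can never come out smaller.
assert specific <= general
print(specific / trials, general / trials)
```

Whatever numbers you plug in for `p_book` and `p_other`, the second frequency can only match or exceed the first; the commenter’s point is that people nonetheless find the first easier to feel and price.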
But maybe I am wrong about the basis of the conjunction fallacy. I couldn’t find a name for the specific cognitive bias I think I see here, nor for the more general class that would include both the conjunction fallacy and this thing. I assign a high probability to my missing relevant information simply because I’m ignorant of where to look.
To net all of that out, I think it would be an effective tactic to market surviving into the distant future by promising very specific benefits.
I don’t think it’s a logical fallacy at all. I mean, anyone who changes their mind about cryonics because of the promise of future Margaret Atwood is probably not being very rational, but formally there’s nothing wrong with that reasoning.
I’m an Atwood-reading robot. I exist only to read every Margaret Atwood novel. I expect to outlive her, so the future holds nothing of value to me. No need for cryonics. Oh but what’s this? A secret Atwood novel to be released in 2114? Sign me up! I’ll go back to suicidal apathy after I’ve read the 2114 novel.
Max L.