Expanding conjunctive probabilities without expanding disjunctive probabilities is another classic form of one-sided rationality. If I wanted to make cryonics look more probable than this, I would individually list out many different things that could go right.
For the purpose of establishing that it’s not a Pascalian probability, it suffices to talk about a lower bound on the main line of reasoning.
Ah, I see that I said “estimate” instead of “lower bound” in the critical place. I’ll edit.
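A minimal sketch of the lower-bound point, with invented numbers (none of these figures come from the thread): the main conjunctive line is only one of several disjoint routes to the outcome, so its probability can only understate the total.

```python
# Illustrative sketch (numbers made up): why a conjunctive estimate of the
# "main line" serves as a lower bound on the total probability, provided the
# individual factor estimates aren't optimistic.

p_main = 0.8 * 0.7 * 0.6 * 0.5      # hypothetical A, B, C, D on the main path
p_alternatives = 0.03 + 0.02        # hypothetical disjoint backup routes
                                    # (e.g. rescue of an abandoned provider)

p_total = p_main + p_alternatives   # disjoint routes only ever add probability
assert p_total >= p_main            # so p_main already works as a lower bound

print(f"main line: {p_main:.3f}, with disjunctions: {p_total:.3f}")
```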
Can you give a few examples?
In this case, I’m not seeing the disjunctive possibilities that lead one to sign up for cryo in this particular case. ABCD seem to be phrased pretty broadly, and A and B in particular are already pretty big. Do you mean as an alternate to D that, say, a new cryo provider takes over the abandoned preserved heads before they thaw? Or as an alternate to C, that even though the cost is high, they go ahead and do it anyway?
Beyond that, I only see scenarios that are nice but don’t point one toward cryopreservation. Like, time-travel scans of dying people, meaning no one ever really died, would be wonderful, but it means getting cryopreserved only did good in that your family wouldn’t be QUITE as sad that you were gone in the time before they ‘died’.
Do you mean as an alternate to D that, say, a new cryo provider takes over the abandoned preserved heads before they thaw?

Sure. That happened already once in history (though there was, even earlier, a loss-thaw). It’s why all modern cryo organizations are very strict about demanding advance payment, despite their compassionate hearts screaming at them not to let their friends die because of mere money. Sucks to be them, but they’ve got no choice.
Or as an alternate to C, that even though the cost is high, they go ahead and do it anyway?

Yep. I’d think FAI scenarios would tend to yield that.
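A small illustrative sketch of what treating C and D disjunctively does to the chain, again with made-up numbers; the `either` helper here is just a noisy-OR over independent routes, so adding a route to a step can only raise that step’s probability and hence the whole chain’s.

```python
# Illustrative sketch with invented numbers: adding a disjunctive alternative
# to a step can only push that step's probability up.

def either(*routes):
    """Probability that at least one independent route succeeds (noisy-OR)."""
    p_none = 1.0
    for p in routes:
        p_none *= (1.0 - p)
    return 1.0 - p_none

p_a, p_b = 0.8, 0.7            # hypothetical values for steps A and B

p_c = either(0.5, 0.2)         # C: revival is affordable, OR someone pays anyway
p_d = either(0.6, 0.1)         # D: provider stays solvent, OR another
                               # organization takes over the patients

chain = p_a * p_b * p_c * p_d
print(f"chain with disjunctive alternatives: {chain:.3f}")
```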
Basically I always sigh sadly when somebody’s discussing a future possibility and they throw up some random conjunction of conditional probabilities, many steps of which are actually pretty darned high when I look at them, with no corresponding disjunctions listed. This is the sort of thinking that would’ve led Fermi to cleverly assign a probability way lower than 10% to having an impact, by the time he was done expanding all the clever steps of the form “And then we can actually persuade the military to pay attention to us...” If you’re going to be silly about driving down all impact probabilities to something small via this sort of conjunctive cleverness, you’d better also be silly and multiply the resulting small probability by a large payoff, so you won’t actually ignore all possible important issues.
“And then we can actually persuade the military to pay attention to us...”

The government did sit on it for quite a while, delaying the bomb until after the defeat of Germany. Nudges from Britain were important in getting things moving.
The military “paid attention to them” long before that though.
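The Fermi point above is easy to reproduce with arithmetic. A hedged sketch, all numbers invented: a conjunction of eight individually likely steps already multiplies out to something around 0.25, and if you insist on driving an estimate down this way, consistency says you still weigh it against the payoff rather than rounding it to zero.

```python
# Illustrative sketch, numbers invented: a long conjunction of individually
# "pretty darned high" steps still multiplies out to something small.

steps = [0.9, 0.85, 0.9, 0.8, 0.75, 0.9, 0.85, 0.8]  # eight hypothetical conditionals
p = 1.0
for s in steps:
    p *= s
print(f"conjunctive estimate: {p:.3f}")              # comes out around 0.25 here

# If you drive the probability down this way, the same logic obliges you to
# multiply the result by the payoff rather than ignoring it.
payoff = 1e6                                         # arbitrary units, purely illustrative
print(f"expected value: {p * payoff:,.0f}")
```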