Imagine you’ve found yourself with an incurable disease and 3 years to live.
This is an obvious and common enough analogy that you don’t need to frame it as a thought experiment. I understand that I have an incurable disease. It’s longer than 3 years, I hope, but not by much more than an order of magnitude, and certainly nowhere near two orders of magnitude. I’m not even doing everything I could in terms of lifestyle, exercise, and nutrition to extend it, let alone pursuing “experimental” cures. It’s not infectious, fortunately; everyone already has it.
Friends I’ve lost to disease, accident, or suicide ALSO didn’t universally commit to “experimental cures”: in every case I know of, the cost of the long shots (the non-monetary cost of side effects more than pure money, but some of that too) outweighed their perceived chance of success.
As Pascal’s Wager options go, giving up significant resources or happiness over the next decade for a VERY TINY chance of living longer seems to be among the less compelling formulations.
Equating high-risk/high-reward strategies with Pascal’s Wager is an all-too-common failure mode, and putting numbers on your estimates helps guard against it. How much is VERY TINY, how much do you think the best available options really cost, and how much would you be willing to pay (assuming you had that kind of money) for a 50% chance of living to 300 years?
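To make the framing behind that question explicit (standard expected value, with $p$ and $V$ as placeholders I’m introducing rather than numbers you’ve committed to): if living to 300 is worth $V$ to you and an intervention succeeds with probability $p$, it’s worth giving up at most

$$E = p \cdot V,$$

so a 50% shot is worth up to $V/2$, whereas the Pascal’s Wager failure mode is being moved by $p \cdot V$ even when $p$ is astronomically small.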
To be clear, I’m not so much trying to convince you personally as trying to get a generally better sense of the inferential distances involved.
I’d actually like to be convinced, but I suspect our priors differ by enough that it’s unlikely. I currently assign less than a 0.05% probability to living another 50 years (which would put me over 100), and three orders of magnitude less to living to 300. These are small enough that my beliefs don’t really have the precision those figures imply, of course.
Conditional on significant lifestyle changes, I can probably raise those chances by 10x, from vanishingly unlikely to … vanishingly unlikely. Conditional on more money than I’m likely to have (and I’m already in the top few percent of humanity), maybe another 3x.
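Spelling out how those multipliers compound (rough numbers, not precise estimates):

$$p_{100} \approx 5 \times 10^{-4}, \qquad p_{100} \times 10 \times 3 \approx 1.5 \times 10^{-2}, \qquad p_{300} \approx 5 \times 10^{-7}.$$

Even with every favorable adjustment, I land at around a percent for reaching 100, and the 300-year case stays negligible.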
I don’t believe there are any tradeoffs I can make which would give me a 50% chance to live to 300 years.
That’s, like, a 99.95% probability of failure: one-in-two-thousand odds of survival. You’d have two orders of magnitude better odds of surviving if you literally shot yourself with a literal gun. I’m not sure you can forecast anything at all (about humans or technologies) with that degree of certainty decades into the future, and definitely not that every single one of dozens of attempts in a technology you’re not an expert in will fail, and every single one of hundreds of attempts in another technology you’re not an expert in (building aligned AGI) will fail.
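Put as arithmetic (the gunshot survival figure is my own rough assumption, for illustration only):

$$1 - 5 \times 10^{-4} = 0.9995, \qquad 5 \times 10^{-4} = \tfrac{1}{2000}, \qquad 5 \times 10^{-4} \times 10^{2} = 5 \times 10^{-2}.$$

That is, you’re giving yourself one-in-two-thousand odds, while surviving a self-inflicted gunshot is plausibly on the order of one in twenty.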
I don’t believe there are any tradeoffs I can make which would give me a 50% chance to live to 300 years.
I don’t believe it either; it was a thought experiment. I assumed that would be obvious, since it’s a very common technique for estimating how much weight one should give to low probabilities.
I think we’ve found at least one important crux, so I’m going to bow out now. I realize I misspoke earlier: I don’t much care whether I become convinced, but I very much hope you succeed in keeping me and you and others alive much longer.