Equating high-risk/high-reward strategies with Pascal's Wager is an all-too-common failure mode, and it helps to put numbers on your estimates. How much is VERY TINY, how much do you think the best available options really cost, and how much would you be willing to pay (assuming you had that kind of money) for a 50% chance of living to 300 years?
To be clear, I’m not so much trying to convince you personally, as to get a generally better sense of the inferential distances involved.
I’d actually like to be convinced, but I suspect our priors differ by enough that it’s unlikely. I currently assign less than a 0.05% chance that I’ll live another 50 years (which would put me over 100), and a probability three orders of magnitude lower that I’ll live to 300. These numbers are small enough that I don’t have as much precision in my beliefs as they imply, of course.
Conditional on significant lifestyle changes, I can probably raise those chances by 10x, from vanishingly unlikely to … vanishingly unlikely. Conditional on more money than I’m likely to have (which is already in the top few percent of humanity), maybe another 3x.
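For concreteness, here is the arithmetic behind these estimates and multipliers, using only the figures stated above (the variable names are mine, added for illustration):

```python
# Figures as stated in the thread above (personal estimates, not data)
p_live_to_100 = 0.0005                 # "less than a 0.05% chance" of 50 more years
p_live_to_300 = p_live_to_100 / 1000   # "three orders of magnitude" lower

lifestyle_factor = 10   # "significant lifestyle changes" raise the chances ~10x
money_factor = 3        # "more money than I'm likely to have": maybe another 3x

p_100_best_case = p_live_to_100 * lifestyle_factor * money_factor  # ~1.5%
p_300_best_case = p_live_to_300 * lifestyle_factor * money_factor  # ~0.0015%

print(f"Best case, live to 100: {p_100_best_case:.2%}")
print(f"Best case, live to 300: {p_300_best_case:.4%}")
```

Even stacking both multipliers, the best case stays around 1.5% for reaching 100 and around 0.0015% for reaching 300, which is the "vanishingly unlikely to … vanishingly unlikely" point.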
I don’t believe there are any tradeoffs I can make which would give me a 50% chance to live to 300 years.
That’s, like, 99.95% certainty: one-in-two-thousand odds of survival. You’d have a survival chance two orders of magnitude higher if you literally shot yourself with a literal gun. I’m not sure you can forecast anything at all (about humans or technologies) with that degree of certainty decades into the future, and definitely not that every single one of dozens of attempts in a technology you’re not an expert in fails, and every single one of hundreds of attempts in another technology you’re not an expert in (building aligned AGI) fails.
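The conversions in that reply can be checked in a couple of lines (again using only the thread's own figures; the names are mine):

```python
# The "< 0.05%" survival estimate from earlier in the thread
p_survive_50y = 0.0005

certainty_of_death = 1 - p_survive_50y   # -> 0.9995, the "99.95%" figure
odds_against = 1 / p_survive_50y         # -> 2000, "one in two thousand"

# "Two orders of magnitude higher" survival chance than 0.05%:
gun_survival_comparison = p_survive_50y * 100   # -> 0.05, i.e. 5%
```

So the reply's claim is that assigning 0.05% to survival implies worse odds than a 5% event, by a factor of one hundred.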
I don’t believe there are any tradeoffs I can make which would give me a 50% chance to live to 300 years.
I don’t believe it either; it’s a thought experiment. I assumed that would be obvious, since it’s a very common technique for estimating how much one should value low probabilities.
I think we’ve found at least one important crux, so I’m going to bow out now. I realize I misspoke earlier—I don’t much care whether I become convinced, but I very much hope you succeed in keeping me and you and others alive much longer.