I’m usually fine with dropping a one-time probability of 0.1% from my calculations. 10% is much too high to drop from a major strategic calculation, but even so I’d be uncomfortable building my life around one. If this were a very well-defined number, as in the asteroid calculation, then it would be more tempting to build a big reference class of risks like that one and work on stopping them collectively. If an asteroid were genuinely en route, large enough to wipe out humanity, possibly stoppable, and nobody was doing anything about this 10% probability, I would still be working on FAI, but I would be screaming pretty loudly about the asteroid on the side. If the asteroid is just going to wipe out a country, I’ll make sure I’m not in that country and then keep working on x-risk.
I didn’t like his anecdote, either.
I think you’ve read him wrong. He’s opposed to “don’t pay attention to high utility * small probability scenarios”, on the basis of heroism.