Multifoliaterose said his result held even for donors who took Astronomical Waste seriously. This seems unlikely to be the case.
Edit: I didn’t vote you down, but what? SIAI is an Existential Risk charity: the point is to save the entire human race, at low probability. Of course the expected value is going to be calculated by multiplying a tiny probability by an enormous value!
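For concreteness, here is a minimal sketch of the arithmetic being defended, with purely illustrative numbers (they are assumptions for the sake of the example, not estimates from any of these comments):

```python
# Expected value of a donation as the product of a tiny success
# probability and an enormous payoff. Both numbers below are
# purely illustrative assumptions.
p_success = 1e-9   # hypothetical chance the donation averts extinction
value = 1e16       # hypothetical future lives at stake (cf. Astronomical Waste)

expected_lives_saved = p_success * value
print(expected_lives_saved)  # 1e7 expected lives, despite the tiny probability
```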
This isn’t true for all existential risks. For example, fears about nuclear war or global warming don’t rely on such tiny probabilities. But discussions of many other risks remind me of this xkcd.
The usual consequentialist case for charity to reduce risks of nuclear war or catastrophic global warming feedbacks does rely on tiny probabilities of donations making a difference. Likewise for voting and for many kinds of scientific and medical research charity.
Edit: not as tiny as in Larks’s comment, although the numbers in these cases are still incredibly low.
If the objection to Pascal’s mugging is that exerting a small force on a big thing gives only a small probability of success, I don’t think that’s even a coherent objection: in a chaotic universe, anything you do may save or destroy individual people and human civilization by redrawing from the distribution of outcomes. Shifting the distribution of the number of saved lives up by one through e.g. aid or buying someone cryo doesn’t seem fundamentally different from shifting the distribution of election outcomes. This is more clearly true if you believe in a multiverse such as MWI, but I don’t think it requires that.
ETA: if your distribution over the number of people killed is uniform over, say, 6 through 10, then definitely saving one person and having a 1 in 5 chance of turning a 10-death situation into a 5-death situation are the same thing except perhaps for the identities of the victims, right?
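A minimal sketch checking that claim (my own reconstruction of the comment’s arithmetic): under the uniform prior, the 10-death case itself occurs with probability 1 in 5, and conditionally replacing it with a 5-death outcome yields exactly the same distribution as shifting every outcome down by one.

```python
from fractions import Fraction

# Uniform prior over the number of deaths: 6 through 10, probability 1/5 each.
prior = {n: Fraction(1, 5) for n in range(6, 11)}

# Intervention A: definitely save one person (shift every outcome down by one).
save_one = {n - 1: p for n, p in prior.items()}

# Intervention B: turn the 10-death outcome into a 5-death outcome.
# Under the uniform prior, that outcome occurs with probability 1/5,
# so this is the "1 in 5 chance" gamble from the comment.
gamble = dict(prior)
gamble[5] = gamble.pop(10)

# The two interventions induce exactly the same outcome distribution:
# uniform over 5 through 9 deaths.
assert save_one == gamble
```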
That’s a very different value of “tiny” than the one in Larks’s comment.