Doesn’t that fall prey to Pascal’s Mugging?
I don’t think decreasing existential risk falls prey to it, because the probability of an existential catastrophe isn’t extremely small. One survey taken at Oxford estimated a ~19% chance of human extinction before 2100. Determining the probability of existential catastrophe is very challenging and that statistic should be viewed skeptically, but a probability anywhere near 19% would still (as far as I can tell) prevent it from falling prey to Pascal’s mugging.
But your earlier quote says it makes sense to reduce the risk by a millionth of a percentage point because the expected value of the lives saved is still large. It doesn’t propose reducing the risk from 19% to nothing; it proposes reducing it by a tiny amount. Only in the unlikely event that this tiny change happens to be the tipping point that prevents extinction does the reduction actually matter; the expected value comes from multiplying that unlikelihood by the enormous number of lives saved if it were. That sounds like Pascal’s Mugging. I agree that reducing the 19% to 0 wouldn’t be Pascal’s Mugging, but I think reducing it to 18.999999% is.
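To make the arithmetic we’re both gesturing at concrete, here’s a minimal sketch in Python. The 10^16 future-lives figure is purely an illustrative assumption on my part (a commonly cited order of magnitude, not something from this thread); the 19% and the one-millionth-of-a-percentage-point reduction come from the discussion above.

```python
# Rough sketch of the expected-value arithmetic both sides are invoking.
# ASSUMPTION: the 10**16 potential-future-lives figure is illustrative,
# not from this thread. The other numbers come from the discussion above.

base_risk = 0.19            # ~19% chance of extinction before 2100
reduction = 1e-6 / 100      # one millionth of one percentage point = 1e-8
future_lives = 1e16         # assumed number of potential future lives

# The reduced risk matches the "18.999999%" figure above:
new_risk = base_risk - reduction
print(f"New risk: {new_risk:.8%}")  # 18.99999900%

# Expected lives saved by shaving `reduction` off the extinction probability:
expected_lives_saved = reduction * future_lives
print(f"Expected lives saved: {expected_lives_saved:,.0f}")  # 100,000,000

# The Pascal's-Mugging worry: the chance that this tiny reduction is the
# decisive tipping point is itself tiny, yet the expected value stays huge
# because the payoff (all future lives) is astronomically large.
```

So the expected value is large either way; the disagreement is over whether multiplying a tiny probability by an astronomical payoff is legitimate reasoning or a mugging.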
I see what you mean. I don’t really know enough about Pascal’s mugging to determine whether decreasing existential risk by 1 millionth of a percent is worth it, but it’s a moot point, as it seems reasonable that existential risk could be reduced by far more than 1 millionth of one percent.