But your earlier quote says that it makes sense to reduce risk by a millionth of a percentage point because the expected value of lives saved is still large. It doesn’t propose reducing the risk from 19% to nothing; it proposes reducing the risk by a tiny amount. Only in the unlikely event that that tiny change happens to be the tipping point that prevents extinction would this reduction be beneficial; the expected value is derived by multiplying this unlikelihood by the large number of lives saved were it to be true. That sounds like Pascal’s Mugging. I agree that it wouldn’t be Pascal’s Mugging to reduce the 19% to 0, but I think that reducing it to 18.999999% is.
I see what you mean. I don’t really know enough about Pascal’s Mugging to determine whether decreasing existential risk by one millionth of a percentage point is worth it, but it’s a moot point, as it seems reasonable that existential risk could be reduced by far more than one millionth of a percentage point.