I’m not sure whether this is worth a whole post on its own, but I’ve been wondering about x-risk, particularly the utilitarian/EA justification for donating to charities dedicated to reducing it, and how it seems similar to Pascal’s mugging. Perhaps I’m just rationalising my unwillingness to donate to charities with hazier benefits than mosquito nets, but I’d like to discuss it with the community.
As I understand it, in Pascal’s mugging, Pascal is walking down the street when a mugger comes up to him and says “Give me £100 or I’ll torture 3^^^3 consciousnesses in the simulation I have in my pocket”. However unlikely Pascal thinks it is that the mugger has such a pocket simulation, the amount of suffering is high enough that it swamps the doubt and Pascal is morally compelled (whatever that means) to hand over the money.
I’m aware that many people reject this in some way. But many people also donate to x-risk charities, and I don’t see how you can consistently tell Pascal not to give the mugger his money and simultaneously allow yourself to be “mugged” by the x-risk charities.
Most people who donate to x-risk charities consider the probability of the risk to be a lot higher than the probability that a person saying “Give me £100 or I’ll torture 3^^^3 consciousnesses in the simulation I have in my pocket” is telling the truth.
Pascal’s mugging applies when the sheer magnitude of the benefit is a substitute for any argument that the benefit is actually likely. Arguments for x-risk charities are typically of the form, “This probability is small but nonvanishing because of Y and Z. When you multiply it out, the big benefit makes the expected gain large despite the attenuating factor.” Some go further and argue that the probability is not small. None simply rely on the magnitude of the benefit without also arguing for plausibility.
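To make the shape of that “multiply it out” step concrete, here is a minimal sketch of the expected-value arithmetic. All of the numbers are made-up placeholders, not anyone’s actual estimates; the point is only that the x-risk case argues for its probability while the mugger’s case merely asserts a huge payoff.

```python
# Illustrative only: every figure below is a placeholder assumption.

donation = 100  # pounds

# Mugger: the claimed harm is astronomically large, but the only evidence is
# the claim itself, so the probability assigned to it can be made arbitrarily small.
mugger_lives_at_stake = 1e60   # stand-in; 3^^^3 is far too large to represent
p_mugger_truthful = 1e-70      # probability the pocket simulation is real

# X-risk charity: the harm is "merely" everyone alive (plus future people),
# and the probability that the donation helps is argued for, not just asserted.
xrisk_lives_at_stake = 8e9
p_donation_averts_xrisk = 1e-9  # placeholder for "small but nonvanishing"

ev_mugger = p_mugger_truthful * mugger_lives_at_stake
ev_xrisk = p_donation_averts_xrisk * xrisk_lives_at_stake

print(f"Expected lives saved per £{donation}:")
print(f"  paying the mugger:       {ev_mugger:.2e}")
print(f"  donating to x-risk org:  {ev_xrisk:.2e}")
```

With these toy inputs the mugger’s expected value collapses to roughly nothing while the charity’s does not; whether that holds in reality depends entirely on how defensible the probability estimates are, which is exactly the point of contention.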
http://www.vox.com/2015/8/10/9124145/effective-altruism-global-ai