If asked independently whether or not I would take a dust speck in the eye to spare a stranger 50 years of torture, I would say “sure”. I suspect most people would if asked independently. It should make no difference to each of those 3^^^3 dust speck victims that there are another (3^^^3)-1 people who would also take the dust speck if asked.
It seems, then, that there are thresholds in human value. Human value might be better modeled by the surreal numbers than by the reals. In such a system we could represent the utility of 50 years of torture as −ω and the utility of a dust speck in one’s eye as −1. This way, no matter how many dust specks end up in eyes, they never add up to torturing someone for 50 years. However, we would still minimize torture, and still minimize dust specks.
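To make the tiered-value idea concrete, here is a minimal sketch, not from the original comments, of one way to model it: a two-tier utility compared lexicographically, which mimics a surreal-style value of the form −ω·(tortures) − (specks). The class and constant names are illustrative assumptions, not anything proposed above.

```python
# Sketch of a lexicographic ("tiered") utility: the infinite tier always
# dominates, so no number of finite-tier harms ever outweighs one
# infinite-tier harm.

from dataclasses import dataclass


@dataclass(frozen=True)
class TieredUtility:
    """Utility with an infinite tier and a finite tier, compared lexicographically."""
    infinite: int  # coefficient on the infinite unit (e.g. -1 per 50-year torture)
    finite: int    # ordinary finite utility (e.g. -1 per dust speck)

    def __add__(self, other: "TieredUtility") -> "TieredUtility":
        return TieredUtility(self.infinite + other.infinite,
                             self.finite + other.finite)

    def __lt__(self, other: "TieredUtility") -> bool:
        # Lexicographic comparison: the infinite tier is decided first.
        return (self.infinite, self.finite) < (other.infinite, other.finite)


TORTURE = TieredUtility(infinite=-1, finite=0)     # hypothetical: one 50-year torture
DUST_SPECK = TieredUtility(infinite=0, finite=-1)  # hypothetical: one dust speck

# However many specks we pile up, the total never sinks below a single torture.
many_specks = TieredUtility(0, -10**100)  # standing in for 3^^^3 specks
assert TORTURE < many_specks              # the torture outcome is still strictly worse
```

Under this ordering an agent still prefers fewer specks to more specks, but no accumulation of specks ever crosses the threshold into the torture tier, which is the behavior the surreal-valued proposal is after.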
The greater problem is to exhibit a general procedure for when we should treat one fate as being infinitely worse than another, vs. treating it as merely being some finite amount worse.
That’s a fairly manipulative way of asking you to make that decision, though. If I were asked whether or not I would take a hard punch in the arm to spare a stranger a broken bone, I would answer “sure”, and I suspect most people would as well. However, it is clear to me that 3^^^3 people getting punched is much, much worse than one person breaking a bone.
> It should make no difference to each of those 3^^^3 dust speck victims that there are another (3^^^3)-1 people who would also take the dust speck if asked.
That rests on the assumption that each person cares only about their own dust speck and the possible torture victim. If people are allowed to care about the aggregate quantity of suffering, then this choice might be an Abilene paradox: each person individually agrees to take the speck, yet the aggregate outcome is one that none of them would prefer.
Here’s a suggestion: if someone going through fate A is incapable of noticing whether or not they are also going through fate B, then fate A is infinitely worse than fate B.