Let’s suppose we measure pain in pain points (pp). Any event that can cause pain is assigned a value in [0, 1], with 0 being no pain and 1 being the maximum amount of pain perceivable. To calculate the pp of an event, assign a value to the pain, say p, and then multiply it by the number of people who will experience the pain, n. So for the torture case, assume p = 1 and n = 1 (one person tortured), then:
torture: 1 * 1 = 1 pp
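(To make the bookkeeping concrete, here’s a minimal Python sketch of this calculus; the function name and values are my own illustration, not part of the original problem.)

```python
def pain_points(p: float, n: int) -> float:
    """Total pain of an event: per-person pain p in [0, 1] times headcount n."""
    assert 0.0 <= p <= 1.0, "p must be a pain value in [0, 1]"
    return p * n

# Torture case: maximum pain (p = 1) for a single person (n = 1).
print(pain_points(1.0, 1))  # 1.0 pp
```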
For the speck-in-the-eye case, suppose the dust speck causes the least amount of pain greater than zero that is possible; denote this amount by e, and note that here n = 3^^^3. Then if e < 1/3^^^3,
speck: 3^^^3 * e < 1 pp
and if e > 1/3^^^3,
speck: 3^^^3 * e > 1 pp
So assuming our moral calculus is to always choose whichever option generates the least pp, we need only ask whether e is greater than or less than 1/n, where n = 3^^^3.
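Here’s that decision rule as a sketch. 3^^^3 is far too large to represent on any machine, so I’m using a stand-in headcount just to show which way the comparison tips; the specific numbers are illustrative, not part of the problem.

```python
def choose(e: float, n: int) -> str:
    """Pick whichever option generates the least pp: torture (1 pp) vs. specks (e * n pp)."""
    return "torture" if e * n > 1.0 else "specks"

# Stand-in headcount; 3^^^3 is unrepresentably larger.
n = 10**18
print(choose(1e-19, n))  # 'specks'  (e < 1/n, so e * n < 1 pp)
print(choose(1e-17, n))  # 'torture' (e > 1/n, so e * n > 1 pp)
```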
If you’ve been paying attention, I now have an out that lets me give no answer: we don’t know what e is, so I can’t decide (at least not based on pp). But I’ll go ahead and wager a guess. Since 1/3^^^3 is vanishingly small, I think it most likely that any pain-sensing system of any present or future intelligence will have e > 1/3^^^3, so I must choose torture, because the torture costs 1 pp while the specks cost more than 1 pp.
This doesn’t feel like what, as a human, I would expect the answer to be. I want to say: don’t torture the poor guy; the rest of us will each suffer a speck so he need not be tortured. But I suspect this is the human inability to deal with large numbers. I think about how I would be willing to accept a speck so the guy wouldn’t be tortured, since e pp < 1 pp, and every other individual, supposing they were pp-fearing people, would make the same short-sighted choice. But the net cost would be to distribute more pain through the specks than the torture ever would.
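To see the short-sightedness numerically (again with toy values of my own choosing, since 3^^^3 won’t fit anywhere):

```python
e, n = 1e-9, 10**12  # stand-in per-person pain and headcount, purely illustrative

# Each person compares only their own speck against the torture and votes "speck".
individual_prefers_speck = e < 1.0  # True for every single person

# But the pp calculus sums over everyone, and the specks swamp the torture.
total_speck_pain = e * n            # 1000.0 pp vs. the torture's 1 pp
print(individual_prefers_speck, total_speck_pain)  # True 1000.0
```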
Weird how the human mind can find the logical answer and still expect an illogical answer to be the truth.