Why would they all be at pain level zero? I’d expect them to be randomly distributed in all their traits unless specified otherwise. If I give them a mean pain of zero and a standard deviation of 1, then among 3^^^3 people there’d be no shortage of people with a pain level of 1,000,000,000,000. The same goes for any reasonable distribution.
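To put a rough number on that, here’s a quick sketch (assuming, purely for illustration, a Normal(0, 1) pain distribution and the standard Gaussian tail approximation):

```python
import math

# Sketch: under an assumed Normal(0, 1) pain distribution, how rare is
# pain above 10**12, and does a population of 3^^^3 still contain such
# people? Uses the Mills-ratio tail asymptotic P(X > t) ~ phi(t) / t.

t = 1e12  # pain threshold

# log10 of the tail probability, computed in log space to avoid underflow
log10_tail = (-t**2 / 2 - math.log(t) - 0.5 * math.log(2 * math.pi)) / math.log(10)
print(f"log10 P(pain > 1e12) ~ {log10_tail:.3e}")  # about -2.17e23

# 3^^^3 is a tower of 3s about 7.6 trillion levels high, so even
# log10(3^^^3) dwarfs 10**23 beyond comparison. The expected number of
# people above the threshold -- population times tail probability -- is
# therefore still astronomically large, not zero.
```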
If you play around with my paradox a bit more, you can work out that if you have 1,000,000,000,000 people at pain level n, and one person at pain level zero, there must be some crossover n between 0 and 999,999,999,999: below it, torturing the one person is at least as bad as giving the rest dust specks, and above it, the dust specks are worse.
And where would that crossover fall? If you have 1,000,000,000,000 people at pain 999,999,999,999, and one at pain 0, would you rather torture the one, or give the 1,000,000,000,000 dust specks?
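To make the crossover concrete, here’s a sketch. Everything in it is assumed for illustration: torture is modeled as raising one person’s pain from 0 to 10^12, a speck adds one unit of pain, and the disutility function D(p) = p^2 is hypothetical, chosen only because its marginal disutility grows with pain:

```python
# Hypothetical model: disutility D(p) = p**2, a speck adds 1 pain unit,
# torture raises one person's pain from 0 to TORTURE_PAIN.

def D(p: int) -> int:
    """Assumed total disutility of being at pain level p."""
    return p * p

N = 10**12             # people receiving dust specks
TORTURE_PAIN = 10**12  # assumed pain level that torture inflicts

torture_cost = D(TORTURE_PAIN) - D(0)

# Marginal cost of one speck at pain n is D(n + 1) - D(n) = 2n + 1, so the
# specks are at least as bad as the torture once N * (2n + 1) >= torture_cost.
n_crossover = (torture_cost // N - 1) // 2 + 1
speck_cost = N * (D(n_crossover + 1) - D(n_crossover))

print("crossover n:", n_crossover)                            # 500000000000
print("specks at least as bad:", speck_cost >= torture_cost)  # True
```

With these assumptions the flip happens around n = 5 × 10^11, comfortably inside the 0 to 999,999,999,999 range.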
they haven’t reached my threshold for caring about the pain they are experiencing
So, are you saying that there’s a threshold x, such that any amount of pain less than x doesn’t matter? That would mean that raising 3^^^3 people’s pain from x-1 to x does nothing, but raising it from x to x+1 is horrible. Put another way: you have 3^^^3 people at a pain level of x-1, and you give them all one dust speck. This doesn’t matter. Give them each a second dust speck, and now it’s an unimaginable atrocity.
I would expect a cutoff like this to be an approximation. What you’d actually think is that the marginal disutility of pain starts out at zero and steadily increases as pain grows, approaching one. If that’s true, one dust speck brings the pain to 1, which puts the marginal disutility slightly above zero, so the speck has some tiny amount of badness. Multiply that by 3^^^3 and it’s unimaginable.
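For instance (a sketch; the curve m(p) = 1 - e^(-p/SCALE) is just one assumed example of “starts at zero, approaches one”, and SCALE is an arbitrary constant):

```python
import math

SCALE = 1e9  # arbitrary assumed constant setting where the curve bends

def marginal_disutility(p: float) -> float:
    """Assumed marginal disutility of one extra unit of pain at level p:
    starts at 0 when p = 0 and approaches 1 as p grows."""
    return 1.0 - math.exp(-p / SCALE)

# A dust speck takes a person from pain 0 to pain 1; approximate its badness
# by the marginal rate at pain 1 -- slightly above zero, but not zero.
speck_badness = marginal_disutility(1.0)
print(f"one speck at pain 0: ~{speck_badness:.2e}")  # ~1e-09

# 3^^^3 can't be written down, but even a modest 10**30 recipients already
# swamps the tininess of each individual speck:
print(f"10**30 specks: ~{10**30 * speck_badness:.2e}")  # ~1e+21
```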
Why would they all be at pain level zero? I’d expect them to be randomly distributed in all their traits unless specified otherwise. If I give them a mean pain of zero and a standard deviation of 1, then among 3^^^3 people there’d be no shortage of people with a pain level of 1,000,000,000,000. The same goes for any reasonable distribution.
It’s a thought experiment. The whole scenario is utterly far-fetched, so there’s no use in arguing that this or that detail of the thought experiment is what we should “expect” to find.
As such, I choose the version of the thought experiment that best teases out the dilemma Yudkowsky is trying to explore, which concerns whether we should consider pain to be denominated all in the same units (i.e. 3^^^3 x minuscule pain > 1 x torture) in our moral calculations.
EDIT: in response to the rest of your comment, see my reply to “Unnamed”.