To consider it like this misses the point of the exercise, which is to treat each individual dust speck as the tiniest amount of disutility you can imagine, and to multiply that tiny disutility by the enormous number of people who receive it. If you instead treat each dust speck as representing a small probability of a huge disutility (death), the equation becomes different in the minds of many, because it is now about adding up small probabilities rather than adding up small disutilities.
In short: you ought to consider the least convenient world, in which you are assured that the momentary inconvenience/annoyance of the dust speck is all the disutility these people will suffer if you choose “dust specks”; they won’t be involved in accidents of any kind, and so on.
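To put that in rough symbols (my own shorthand, purely illustrative): write $N$ for 3^^^3, $\varepsilon$ for the tiny disutility of one dust speck, $D_{\text{torture}}$ for the disutility of fifty years of torture, $p$ for the small probability that a speck leads to death, and $D_{\text{death}}$ for the disutility of dying. The exercise as posed asks you to weigh

\[
N \cdot \varepsilon \quad \text{vs.} \quad D_{\text{torture}},
\]

whereas the reinterpretation asks you to weigh

\[
N \cdot p \cdot D_{\text{death}} \quad \text{vs.} \quad D_{\text{torture}},
\]

and many people’s intuitions treat those two comparisons very differently.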
Well, in that case, you’re specifying that the additional stress is being applied to people who can take it.
Let’s turn the whole danged thing around:
Would you rather that 3^^^^3 people got one less dust speck in their eyes (in times when dust specks were not the limiting factor on much more important activities), or prevent one person from being horribly tortured for 50 years?
A related question would go:
Would you volunteer not to have your dust speck count reduced, with it understood that A) if 3^^^3 people volunteer for this, someone will not be tortured, and B) there are well over 3^^^3 people being asked this, so volunteering is not futile?
If you can find 3^^^3 volunteers, must they be irrational, or are they just willing to take that tiny hit?
Question about your hypothetical: What happens if fewer than the necessary number of people volunteer for the dust speck? A) Do they get the dust speck to no purpose, or B) is their speck count reduced as though they hadn’t volunteered?
I wouldn’t volunteer either way, if you set aside irrational guilt trips, as ArisKatsaris said. If there are 3^^^3 volunteers, they are irrational in situation B, and probably in situation A as well.
Would you rather that 3^^^^3 people got one less dust speck in their eyes (in times when dust specks were not the limiting factor on much more important activities), or prevent one person from being horribly tortured for 50 years?
That’s the same question effectively, so the former.
A related question would go: Would you volunteer not to have your dust speck count reduced, with it understood that A) if 3^^^3 people volunteer for this, someone will not be tortured, and B) there are well over 3^^^3 people being asked this, so volunteering is not futile?
Yes, I would volunteer for this, but only because I can rationally anticipate that if I declined to volunteer, I would irrationally feel guilty about it, and that guilt would be a significantly larger disutility in the long term than a dust speck. In short, I’d be comparing a dust speck to the disutility of irrational guilt, not to the disutility of torture divided by 3^^^3.
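Borrowing the rough shorthand from above (again, purely illustrative): the comparison I’m actually making when I volunteer is

\[
\varepsilon_{\text{speck}} \quad \text{vs.} \quad D_{\text{guilt}},
\qquad \text{not} \qquad
\varepsilon_{\text{speck}} \quad \text{vs.} \quad \frac{D_{\text{torture}}}{3\uparrow\uparrow\uparrow 3},
\]

where $D_{\text{guilt}}$ is the long-term disutility of the irrational guilt and $3\uparrow\uparrow\uparrow 3$ is up-arrow notation for 3^^^3.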
Question about your hypothetical: What happens if fewer than the necessary number of people volunteer for the dust speck? Do they get the dust speck to no purpose, or is their speck count reduced as though they hadn’t volunteered?
So if they said they’d wipe your memory, you wouldn’t volunteer?
I consider it a disutility to lose memories too, so switch that clause to “if you assume that you’re magically not going to have any feelings you consider irrational regarding this decision of yours, or otherwise face some social penalty more severe than that specific dust speck”, and I’ll say “no, I wouldn’t volunteer”.