There are many ways of approaching this question, and one that I think is valuable and which I can’t find any mention of on this page of comments is the desirist approach.
Desirism is an ethical theory also sometimes called desire utilitarianism. The desirist approach has many details for which you can Google, but in general it is a form of consequentialism in which the relevant consequences are desire-satisfaction and desire-thwarting.
Fifty years of torture satisfies none and thwarts virtually all desires, especially the most intense desires, for fifty years of one individual's life, and for most of the subsequent years of that life as well, due to extreme psychological damage. Barely noticeable dust specks neither satisfy nor thwart any desires, and so in a population of any finite size the minor pain is of no account whatever in desirist terms. So a desirist would prefer the dust specks.
The Repetition Objection: If this choice were repeated, say, a billion times, then the lives of the 3^^^3 people would become unlivable due to constant dust specks, and so at some point it must be that an additional individual tortured becomes preferable to another dust speck in 3^^^3 eyes.
The desirist response bites the bullet. Dust specks in eyes may increase linearly, but their effect on desire-satisfaction and desire-thwarting is highly nonlinear. It's probably the case that an additional torture becomes preferable as soon as the expected marginal disutility of the next dust speck is a few million desires thwarted, and certainly the case when the expected marginal disutility of the next dust speck is a few billion desires thwarted.
Can you clarify your grounds for claiming that barely noticeable dust specks neither satisfy nor thwart any desires?
Ah, yeah, that could be a problematic assumption. The grounds for my claim were a generalization from my own experience: I have no consciously accessible desires which are affected by barely noticeable dust specks.
Fair enough. I don’t know what desirism has to say about consciously inaccessible desires, but leaving that aside for now… can you name an event that would thwart the most negligible desire to which you do have conscious access?
I have a high tolerance for chaotic surroundings, but even so I occasionally experience a weak, fleeting desire to impose greater order on other people’s belongings in my physical environment. It could be thwarted by an event like a fly buzzing around my head once, which though not painful at all would divert my attention long enough to ensure that the desire died without having been successfully acted on.
OK. So, if we assume for simplicity that a fly-buzzing event is the smallest measurable desire-thwarting event a human can experience, you can substitute “fly-buzz” for “dust speck” everywhere it appears here and translate the question into a desirist ethical reference frame.
The question in those terms becomes: is there some number of people, each of whom is experiencing a single fly-buzz, where the aggregated desire-thwarting caused by that aggregate event is worse than a much greater desire-thwarting event (e.g. the canonical 50 years of torture) experienced by one person?
And if not, why not?
Well, yes, but then as stated earlier I think desirism bites the bullet on “dust speck”, too, given more dust specks. For a quick Fermi estimate, if I suppose that the fly-buzz-scenario takes about 5 seconds and is 1/1000th as strong (in some sense) as the desire not to be tortured for 5 seconds, then the number of people at which the fly-buzz-scenarios outweigh the torture is a few hundred billion.
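The Fermi estimate above can be checked with a few lines of arithmetic. This is just a sketch of the stated assumptions (a fly-buzz lasts about 5 seconds and thwarts desire about 1/1000th as strongly as 5 seconds of torture); the variable names are mine, not part of the original argument.

```python
# Fermi sketch: how many one-fly-buzz experiences equal 50 years of torture,
# under the assumptions stated above?
SECONDS_PER_YEAR = 365.25 * 24 * 3600

torture_seconds = 50 * SECONDS_PER_YEAR   # the canonical 50 years of torture
buzz_seconds = 5                          # duration of one fly-buzz event
strength_ratio = 1 / 1000                 # buzz vs. torture, per unit time

# Express the torture's total desire-thwarting in fly-buzz units:
# (number of 5-second torture intervals) x (1000 buzzes per interval)
buzz_equivalents = (torture_seconds / buzz_seconds) / strength_ratio

print(f"{buzz_equivalents:.2e}")
```

This comes out to roughly 3 x 10^11 people, i.e. a few hundred billion, which is the order of magnitude the estimate in the comment arrives at.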
Granted, for people who don’t find desirism intuitive, this altered scenario changes nothing about the argument. I personally do find desirism intuitive, though unlikely to be a complete theory of ethics. So for me, given the dilemma between 50 years of torture of one individual and one dust-speck-in-eye or one fly-buzz-distraction for each of 3^^^3 people, I have a strong gut reaction of “Hell yes!” to preferring the specks and “Hell no!” to preferring the distractions.
Ah. I think I misunderstood you initially, then. Thanks for the clarification.