The Felicifia forum for utilitarians has an overlapping userbase and is near your suggestion in concept-space: http://felicifia.org/
What did they make of the dust speck dilemma?
The 3^^^3 dust specks vs torture dilemma is an axis that utilitarians can vary on.
Most utilitarians on Felicifia understand scope insensitivity and will prefer a small amount of torture. Of the rest, some believe in fundamentally different grades of suffering.
Or, you know, they could weight suffering in a continuous, differentiable way that doesn’t make a fundamental distinction in theory but achieves that result in practice: amputating a finger is worth more than a billion pinpricks, one broken arm is worth more than a billion billion nudges, and so on.
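To make that concrete: one toy way to get such an ordering is to let disutility grow exponentially in a harm’s intensity while staying perfectly continuous and differentiable. A minimal sketch, with an assumed base and made-up intensity levels, not a claim about the right weighting:

```python
# Toy disutility model: weight grows exponentially with intensity, so no
# everyday quantity of a mild harm outweighs one severe harm. The base
# and the intensity values are arbitrary assumptions for illustration.

def disutility(intensity: float, count: float = 1.0) -> float:
    """Total disutility of `count` harms at the given intensity level."""
    BASE = 10.0  # each intensity step weighs 10x more (assumed)
    return count * BASE ** intensity

pinpricks = disutility(intensity=1, count=1e9)  # a billion pinpricks
finger = disutility(intensity=11)               # one finger amputation

print(pinpricks < finger)  # True: 1e9 * 10^1 < 10^11
```

The function is smooth everywhere, yet at everyday scales it behaves almost lexically; as the reply below points out, though, no such weighting survives a factor of 3^^^3.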
Or maybe we’re going at it completely wrong, and the models for quantifying overall suffering are simply inadequate to the subject matter. If pain worked like sound, where an order-of-magnitude increase in raw intensity registers as only a linear increase in perceived loudness, you could stack billions of the lower pains without the resulting pain registering very high. And so on.
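As a hedged illustration of that sound analogy (a decibel-style scale; the numbers are assumptions, not a theory of pain):

```python
import math

# Decibel-style aggregation: the perceived level grows with the log of
# the total raw intensity, so each billionfold increase in the count of
# small pains adds only a constant 90 points to the perceived level.

def perceived_level(intensity: float, count: float = 1.0) -> float:
    """Perceived pain on a log scale (assumed model)."""
    return 10 * math.log10(count * intensity)

speck = 1.0  # assumed raw intensity of one dust speck
print(perceived_level(speck))        # 0.0   -- one speck
print(perceived_level(speck, 1e9))   # 90.0  -- a billion specks
print(perceived_level(speck, 1e18))  # 180.0 -- a billion billion specks
```

Under a scale like this the total never explodes the way a linear sum does, which is the sense in which billions of lower pains needn’t register very high.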
Or maybe it’s completely different from that. My point is, the dust speck question is more a question about how the human psychology of pain and reciprocity works than about the merits of particular forms of utilitarianism and deontology, which I feel are only approximations toward modelling that psychology.
That’s not (at all realistically) possible with a number as large as 3^^^3. If there is a number large enough to make a difference, 3^^^3 is larger than that number. You say “and so on”, but you could list a billion things each second, each a billion times worse than the preceding, continue doing so until the heat death of the universe, and you still wouldn’t get anywhere close to a difference even worth mentioning when there’s a factor of 3^^^3 involved.
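Up-arrow notation shows how hopeless the gap is: 3^^^3 = 3^^(3^^3), a power tower of 3s whose height is itself 3^27 ≈ 7.6 trillion, while the billion-per-second listing yields a tower only three levels tall. A quick check of that arithmetic (the ~10^100-year heat-death figure is a rough assumption):

```python
import math

# Height of the power tower in 3^^^3 = 3^^(3^^3): the height is 3^^3.
tower_height = 3 ** 3 ** 3  # right-associative: 3^(3^3) = 3^27
print(tower_height)         # 7625597484987, i.e. ~7.6 trillion levels

# "A billion things a second, each a billion times worse", until an
# assumed heat death ~1e100 years from now:
steps = 1e100 * 365.25 * 24 * 3600  # ~3.2e107 listed items
exponent = 9 * steps                # total factor = 10^exponent
print(f"listing factor ~ 10^10^{math.log10(exponent):.0f}")  # 10^10^108
```

A tower three levels tall against a tower 7.6 trillion levels tall: the entire heat-death listing is not even a rounding error on 3^^^3.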
Then how about we take the human brain’s inability to multiply into account? Above a certain number of people, the brain goes numb to any further increments, in suffering or otherwise. Then it wouldn’t matter whether it’s 3^^^3, 3 million, or even 3 thousand people; anything past a certain limit is just background noise, statistics.
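That numbness is easy to sketch as a saturating function: felt scope tracks the count at small numbers, then flattens. The threshold below is arbitrary (Dunbar-flavored, purely illustrative):

```python
import math

def felt_scope(n: float, threshold: float = 150.0) -> float:
    """Perceived magnitude of n affected people: roughly linear below
    an assumed threshold, saturating above it (toy model)."""
    return threshold * (1 - math.exp(-n / threshold))

for n in (3, 3_000, 3_000_000, 1e100):  # 1e100 standing in for 3^^^3
    print(n, round(felt_scope(n), 1))   # 3.0, then 150.0 forever after
```

Past the threshold, whether the count is 3 thousand or 3^^^3 makes no difference to the felt quantity, which is exactly the “background noise” effect.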
Which I suppose would have an interesting effect on the value of genocides and other mass-scale inflictions of suffering, and on donation management and other mass-scale alleviations of it. I guess what really matters is the tangible result to you, to those close to you whom you care about, and to the more immediate social environment you move in.
You’d care about the state of a neighborhood not because you care about any of its residents individually; you don’t even know them. No, you just want to walk around happy people so you can feel happy yourself. Depressed people are depressing. A utilitarian, linear calculation of wealth increase (or even one that includes a law of diminishing returns) is simply a very rough approximation toward this goal of seeing smiling faces.
And then there’s of course the matter of satisfying your values, which has much more to do with the state of your own mind than with that of others.
And this is the limit of my working memory for today. I’ll go mull this over… Of course, I suppose I’m hardly being original here; could you point me to sources that have already thought over all this? I’d hate to find out I’m wasting brain-time reinventing the wheel.