How do you feel about the torture vs dust speck situation if you expected to encounter that situation 3^^^3 times, knowing that 3^^^3 dust specks are much worse than 50 years of torture?
More interestingly, have you seen that aggregation argument before, and does it do something inside your mind? Might that be a form of “moral learning”?
I can parse your comment a couple of different ways, so I will discuss multiple interpretations, but forgive me if I’ve misunderstood.
If we are talking about 3^^^3 dust specks experienced by that many different people, then it doesn’t change my intuition. My early exposure to the question included such unimaginably large numbers of people. I recognize scope insensitivity may be playing a role here, but I think there is more to it.
If we are talking about myself or some other individual experiencing 3^^^3 dust specks (or 3^^^3 people each experiencing 3^^^3 dust specks), then my intuition considers that a different situation. A single individual experiencing that many dust specks seems to amount to torture. Indeed, it may be worse than 50 years of regular torture because it may consume many more years to experience them all. I don’t think of that as “moral learning” because it doesn’t alter my position on the former case.
If I have to try to explain what is going on here in a systematic framework, I’d say the following. (1) Splitting harm among multiple people can be better than concentrating it on one person; for example, one person stubbing a toe on two different occasions is marginally worse than two people each stubbing one toe. (2) Harms or moral offenses may separate into classes such that no amount of a lower class can rise to match a higher class; for example, there may be no number of rodent murders that is morally worse than a single human murder. (3) Duration of harm can outweigh intensity; for example, imagine mild electric shocks that are painful but cause no injury, and where receiving one shock doesn’t make the next any more physically painful. A few slightly more intense shocks over a short time may be better than many more mild shocks over a long time. This last point comes in when weighing 50 years of torture against 3^^^3 dust specks experienced by one person, though there the evaluation is much harder to make.
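To make point (2) concrete, here is a minimal sketch in Python of a lexicographic comparison between harm classes. The class labels and the counts are entirely made up, and this isn’t offered as the correct moral function; it only illustrates that an ordering in which no count of a lower-class harm ever outranks a single higher-class harm is coherent.

```python
# A minimal sketch of the "classes of harm" idea, not a claim about how moral
# value actually works. Class labels and counts below are hypothetical.

from dataclasses import dataclass

# Higher number = morally graver class of harm (hypothetical ordering).
DUST_SPECK, STUBBED_TOE, RODENT_MURDER, HUMAN_MURDER = 0, 1, 2, 3

@dataclass(frozen=True)
class Harm:
    harm_class: int  # which class the harm belongs to
    count: float     # how much of it (instances, duration, etc.)

def worse(a: Harm, b: Harm) -> bool:
    """True if a is worse than b under a lexicographic ordering:
    the class dominates entirely; count only breaks ties within a class."""
    return (a.harm_class, a.count) > (b.harm_class, b.count)

# One human murder outranks any finite number of rodent murders:
print(worse(Harm(HUMAN_MURDER, 1), Harm(RODENT_MURDER, 10**100)))  # True

# Within a class, more instances are worse:
print(worse(Harm(DUST_SPECK, 2), Harm(DUST_SPECK, 1)))  # True
```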
Those explanations feel a little like confabulations and rationalizations. However, they don’t seem to be any more so than a total-utilitarian or average-utilitarian explanation of some moral intuitions. They do, at least, give some intuition for why a simple utilitarian approach may not be the “obviously correct” moral framework.
If I failed to address the “aggregation argument,” please clarify what you are referring to.
What I meant was this: assume that 3^^^3 dust specks on one person is worse than 50 years of torture. As long as the dust-speck sensation is somewhat additive, that should be true. Now suppose you have to choose between dust specks and torture 3^^^3 times, once for each person (“so, do we torture individual 27602, or one dust speck on everyone? Now, same question for 27603....”).
Then always choosing dust specks is worse, for everyone, than always choosing torture.
So the dust-speck decision becomes worse and worse, the more often you expect to encounter it.
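To see the structure of this argument with manageable numbers, here is a toy sketch in Python. The population size and the disutility values are made-up stand-ins for 3^^^3 and for the badness of specks and torture (and the sketch assumes speck disutility adds linearly, as above); only the shape of the comparison matters.

```python
# Toy illustration of the aggregation argument with small made-up numbers.
# The structure, not the numbers, is the point.

N = 1_000                    # stand-in for 3^^^3: number of people = number of choices
speck_disutility = 1.0       # hypothetical badness of one dust speck (assumed additive)
torture_disutility = 500.0   # hypothetical badness of one episode of torture

# Policy A: always choose "one speck on everyone" for each of the N choices.
# Every person ends up with N specks.
per_person_if_always_specks = N * speck_disutility

# Policy B: always choose "torture one person" for each of the N choices.
# Each of the N people is tortured exactly once.
per_person_if_always_torture = torture_disutility

print(per_person_if_always_specks)   # 1000.0
print(per_person_if_always_torture)  # 500.0

# If N specks on a single person is worse than one torture (as assumed above),
# then always choosing specks is worse for *every individual*, not just in aggregate.
print(per_person_if_always_specks > per_person_if_always_torture)  # True
```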
27602 may beg to differ.