Well, first of all, I don’t see how that’s an example of scope insensitivity.
It might conceivably be worth the suffering of a few to correct inconsistencies, but the suffering of large numbers is about the worst thing there is—far worse than anybody’s reasoning errors.
EDIT: Actually, on second thought, you may be right: scope insensitivity may not be the fundamental problem here. It’s probably something more basic, like the fact that it’s just wrong to prioritize people’s-preferences-being-a-certain-way over avoiding suffering.
Getting upset and outraged at the existence of flu and human suffering is unlikely to change the universe’s mind. On the other hand, an inefficient response to that and other problems, making them worse, is very much our own fault. So it looks to me like that is very much a defensible position.
I don’t see how the defensibility of the original commenter’s position follows from the previous two sentences.
Whether it’s scope insensitivity, and whether it’s defensible, can be resolved by clarifying two things:
1) Jordan_2010’s utility function
2) The purpose of disgust (or of upset and outrage)
Say disgust is a feeling that arises only in response to certain types of ignorance, and one that serves terminal values by neurochemically compelling one to reduce that ignorance, i.e., to increase awareness in a way that increases one’s utility.
Then disgust ‘would make sense at’ ignorance, and not at the terminal bad itself.
Eliezer gave another example: it might not be effective (‘unlikely to change the universe’s mind’) to be upset and outraged at matters of fact, but it might be effective to direct that upset and outrage at people with the power to reduce the utility-eating facts.
It may be that Jordan_2010 initially seemed to be suffering from scope insensitivity because of a different initial sense of ‘disgust’, such as a general dismay that compels one to action. In that case, ceteris paribus, the terminal bad should cause much more disgust, because it is the worse thing, and this general sense of disgust falls more heavily on terminal values than on instrumental ones. Then, after reading Eliezer’s comment mentioning upset and outrage, your sense of ‘disgust’ may have shifted to something more like the one I described earlier in this comment.
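To make the contrast concrete, here is a minimal toy sketch. It is entirely my own illustration rather than anything from the thread; the names and numbers (badness, tractability, 1000, 10, 0.8) are hypothetical. It just shows that the ‘general dismay’ sense of disgust tracks the terminal bad, while the ‘action signal’ sense tracks the actionable instrumental cause.

```python
# Toy model (hypothetical numbers) of the two senses of 'disgust' contrasted above.

# How bad each thing is in itself: suffering is the terminal bad,
# ignorance is only instrumentally bad (it leads to suffering).
badness = {"suffering": 1000.0, "ignorance": 10.0}

# How much corrective action by the agent can actually change each thing:
# 'unlikely to change the universe's mind' vs. awareness can be increased.
tractability = {"suffering": 0.0, "ignorance": 0.8}

def disgust_as_dismay(target: str) -> float:
    """Sense 1: general dismay, proportional to how bad the thing is in itself."""
    return badness[target]

def disgust_as_action_signal(target: str) -> float:
    """Sense 2: a signal that compels corrective action, worth feeling only
    toward things the agent can change, in proportion to the terminal
    suffering that changing them would avert."""
    return tractability[target] * badness["suffering"]

for target in ("suffering", "ignorance"):
    print(target, disgust_as_dismay(target), disgust_as_action_signal(target))
# Sense 1 assigns more disgust to the terminal bad (1000.0 vs 10.0);
# sense 2 assigns it to the ignorance (0.0 vs 800.0).
```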