Actually, I think you’re right. The escalation argument has caught me in a contradiction. I wonder why I didn’t see it last time around.
I still prefer the specks though. My prior in favor of the specks is strong enough that I have to conclude that there’s something wrong with the escalation argument that I’m not presently clever enough to find. It’s a bit like reading a proof that 2+2 = 5. You know you’ve just read a proof, and you checked each step, but you still, justifiably, don’t believe it. It’s far more likely that the proof fooled you in some subtle way than it is that arithmetic is actually inconsistent.
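(To make the proof analogy concrete, here is a minimal Bayesian sketch in Python. The prior on arithmetic being inconsistent and the chance of being fooled are entirely made-up numbers for illustration, not estimates.)

```python
# Bayesian sketch of the "proof that 2+2=5" analogy.
# H = "arithmetic is inconsistent"
# E = "I read a proof of 2+2=5 and every step checked out"

prior_inconsistent = 1e-12   # assumed prior that arithmetic is inconsistent
p_fooled = 1e-3              # assumed chance a subtle flaw slips past checking

# P(E | H) ~ 1: if arithmetic really were inconsistent, such a proof could exist.
# P(E | not H) = p_fooled: the proof only "checks out" because I was fooled.
p_e = prior_inconsistent * 1.0 + (1 - prior_inconsistent) * p_fooled

posterior = prior_inconsistent * 1.0 / p_e
print(f"P(inconsistent | convincing proof) = {posterior:.2e}")  # ~ 1.00e-09
```

Under these assumed numbers, checking every step moves the posterior by orders of magnitude, yet “the proof fooled me” still dominates, which is exactly the position being taken on the escalation argument.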
Well, we have better reasons to believe that arithmetic is consistent than we have to believe that human beings’ strong moral impulses are coherent in cases outside of everyday experience. I think much of the point of the SPECKS vs. TORTURE debate was to emphasize that our moral intuitions aren’t perceptions of a consistent world of values, but instead a thousand shards of moral desire which originated in a thousand different aspects of primate social life.
For one thing, our moral intuitions don’t shut up and multiply. When we start making decisions that affect large numbers of people (3^^^3 isn’t necessary; a million is enough to take us far outside of our usual domain), it’s important to be aware that the actual best action might sometimes trigger a wave of moral disgust, if the harm to a few seems more salient than the benefit to the many, etc.
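(As a toy illustration of “shut up and multiply”: the per-person harm values below are invented solely for the example; no claim is made about real utilities.)

```python
# Toy "shut up and multiply" comparison. Harm values are invented
# for illustration only.

speck_harm = 1e-9      # assumed disutility of one dust speck
torture_harm = 1e6     # assumed disutility of 50 years of torture

for n in (10**6, 10**12, 10**18):
    total_speck_harm = speck_harm * n
    worse = "SPECKS" if total_speck_harm > torture_harm else "TORTURE"
    print(f"n = {n:.0e}: total speck harm = {total_speck_harm:.0e} -> {worse} is worse")

# For any finite per-speck harm there is some n past which the specks
# dominate -- and 3^^^3 is past it by an unimaginable margin.
```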
Keep in mind that this isn’t arguing for implementing Utilitarianism of the “kill a healthy traveler and harvest his organs to save 10 other people” variety; among its faults, that kind of Utilitarianism fails to consider its probable consequences for human behavior if people know it’s being implemented. The circularity of “SPECKS” just serves to point out one more domain in which Eliezer’s Maxim applies.
This came to mind: what you intuitively believe about a statement may as well be described as an “emotion” of “truthiness”, triggered by holding the model in the focus of attention, just like any other emotion that evaluates situations. Emotion isn’t always right, and an estimate of plausibility isn’t always right, but they are basically the same thing. I used to separate them along the lines of the probability-utility distinction, but that is probably a more confusing than helpful distinction, with truthiness standing on its own and the concept of emotion containing everything but it.
Yup. I get all that. I still want to go for the specks.
Perhaps it has to do with the fact that 3^^^3 is way more people than could possibly exist. Perhaps the specks v. torture hypothetical doesn’t actually matter. I don’t know. But I’m just not convinced.
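(For scale, a short sketch of the Knuth up-arrow notation behind 3^^^3; the recursive definition is standard, and the function name is just illustrative.)

```python
# Knuth's up-arrow notation: 3^^^3 means up_arrow(3, 3, 3) below.

def up_arrow(a: int, n: int, b: int) -> int:
    """Compute a ↑^n b (n up-arrows), defined recursively."""
    if n == 1:
        return a ** b                # one arrow is plain exponentiation
    result = a
    for _ in range(b - 1):           # a ↑^n b = a ↑^(n-1) (a ↑^n (b-1))
        result = up_arrow(a, n - 1, result)
    return result

print(up_arrow(3, 1, 3))  # 27
print(up_arrow(3, 2, 3))  # 7625597484987, i.e. 3^(3^3)
# 3^^^3 = 3^^7625597484987: a power tower of 3s over seven trillion
# levels high. Do not actually call up_arrow(3, 3, 3).
```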
Just give up already! Intuition isn’t always right!