To Eliezer’s list, I would add “Something To Protect” and the very end of “Circular Altruism”. When, during a discussion of health care, a friend of mine said something similar about not really wanting to be rational, I linked him to those two and summarized them like this (goes off and finds the discussion):
I don’t really care what you do on [the first thought experiment]. But I care very much what you do on [the second and third]. The importance of logic appears only when you have something that is more important to you than feeling good.
If your goal is to feel good, you can have whatever health system and whatever solution to the trolley problem makes you feel best. I mean, knowing that I didn’t let that poor old cancer patient die would make me feel really warm and fuzzy inside too. And I’d also feel really awful about pushing a fat man onto the tracks.
But if your goal is to save lives, you lose the right to do whatever you want, and you’d better start doing what’s logical. The logical solution to the two problems does, of course, save more lives than the warm fuzzy alternative.
So the question is: which is more important to you? Feeling good, or saving lives? As Overcoming Bias says:
“You know what? This isn’t about your feelings. A human life, with all its joys and all its pains, adding up over the course of decades, is worth far more than your brain’s feelings of comfort or discomfort with a plan. Does computing the expected utility feel too cold-blooded for your taste? Well, that feeling isn’t even a feather in the scales, when a life is at stake. Just shut up and multiply.”
If you’re using a different example with something less important than saving lives, maybe switch to something more important in the cosmic scheme of things. I’m very sympathetic to people who say good feelings are more important to them than a few extra bucks, and I don’t even think they’re being irrational most of the time. The more important the outcome, the more important rationality becomes relative to happy feelings.