“Due to an unexpected mental glitch, he threatens Joe again. Joe follows his disposition and ignores the threat. BOOM.
Here Joe’s final decision seems as disastrously foolish as Tom’s slip up.”
But of course, the initial decision to take the pill may be rational, and the “final decision” is constrained so much that we might regard it as a “decision” in name only. The way I see it: When Joe takes the pill, he will stop rational versions of Tom from threatening him, meaning he benefits, but will be at increased risk of irrational versions of Tom threatening him, meaning he loses. Whether the decision to take the pill is rational depends on how many rational versions of Tom he thinks are out there and how many irrational ones there are, as well as the relative costs of being forced to shine shoes and being blown up. If Toms tend to be rational, and shining shoes is unpleasant enough, taking the pill may be rational.
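That trade-off can be sketched as a toy expected-cost comparison. All parameter names and numbers below are my own illustrative assumptions, not part of the original scenario: without the pill Joe complies with every threat, while with the pill rational Toms never bother threatening and only irrational Toms do, and then detonate.

```python
def expected_cost(take_pill, p_rational, cost_shoes, cost_boom):
    """Expected cost to Joe of one encounter with a random Tom.

    Toy model: without the pill, every Tom threatens and Joe complies,
    costing cost_shoes. With the pill, rational Toms (probability
    p_rational) never threaten; irrational Toms threaten anyway, Joe
    ignores them, and they detonate, costing cost_boom.
    """
    if take_pill:
        return (1 - p_rational) * cost_boom
    return cost_shoes

# If Toms are overwhelmingly rational and shoe-shining is costly enough,
# taking the pill has the lower expected cost:
assert expected_cost(True, p_rational=0.99, cost_shoes=10, cost_boom=500) \
       < expected_cost(False, p_rational=0.99, cost_shoes=10, cost_boom=500)

# With enough irrational Toms around, the pill becomes a bad bet:
assert expected_cost(True, p_rational=0.5, cost_shoes=10, cost_boom=1000) \
       > expected_cost(False, p_rational=0.5, cost_shoes=10, cost_boom=1000)
```

The decision flips wherever (1 − p_rational) × cost_boom crosses cost_shoes, which is just the "how many irrational Toms are out there" condition stated in prose above.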
This kind of scenario has made me wonder: could this have contributed to some of our emotional tendencies? At times we experience emotions that override our rational behavior. Anger is a good example, though gratitude might be as well. There may be times when it is simply not rational, in terms of reward and cost, to hit back at someone who has wronged us, but we may do so anyway because we are angry. However, if we never got angry and acted rationally all the time, we might be easy targets for people who know that they can wrong us and then retreat to some safe situation where revenge would be irrational. Something that reduces our rationality, so that we act even when it is not in our interests, might, almost paradoxically, be good for us, because it would make it less rational to attack us like this in the first place. Maybe anger is partly there for that reason: literally to ensure that we will do things that could get us killed in order to hit back at someone, as a deterrent.
Of course, someone could ask how people are supposed to know we have that tendency. But when people saw anger working in themselves and others, they would generally get the idea: they would understand the consequences of reduced rationality in some situations. It could be argued that the best strategy is to fake your capacity for anger: become angry in trivial situations, where the cost of the anger is minimal, while acting rationally in the extreme situations where you are likely to get killed. A problem with this is that it is more complicated behavior, so we might assume it would be harder for it to evolve in the first place. There would presumably be some kind of balance between real deterrence and fake deterrence at work here.
I can think of real-world examples of this “pill”. I recall hearing of a wealthy person who told his family that if he were kidnapped, no ransom was to be paid under any circumstances. Now, clearly, his family would be likely to ignore that and pay: once the deterrence has failed, the rational thing is to save his life. That suggests he may have taken precautions: done his best to make it actually impossible for his family to pay a ransom.
This reminds me of a quote from Scott Aaronson’s On Self-Delusion and Bounded Rationality:

“Two cars race toward each other on an empty freeway; the first to swerve is the chicken. How should you play if you want to preserve both your status and your life? The answer is clear: in full view of your opponent, rip out your car’s steering wheel, blindfold yourself, down a bottle of Jack Daniels, scream. If you can persuade your opponent that you’re incapable of making the decision to swerve, then he has to swerve. In other words: the stupider, more ignorant, more irrational you can prove you are, the better the chance you have of winning.”
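The steering-wheel trick can be sketched as a tiny game-theoretic model. The payoff numbers here are illustrative assumptions on my part: once your “swerve” option is visibly gone, the opponent’s best reply to your forced “straight” is to swerve.

```python
# Chicken, with illustrative payoffs (my assumption): each entry maps
# (your move, opponent's move) to (your payoff, opponent's payoff).
PAYOFFS = {
    ("swerve", "swerve"): (0, 0),
    ("swerve", "straight"): (-1, 1),         # you're the chicken
    ("straight", "swerve"): (1, -1),         # opponent is the chicken
    ("straight", "straight"): (-100, -100),  # crash
}

def opponent_best_reply(your_move):
    """Opponent best-responds to the move you are visibly committed to."""
    return max(("swerve", "straight"),
               key=lambda o: PAYOFFS[(your_move, o)][1])

# Ripping out the steering wheel commits you to "straight"; the
# opponent's best reply is then to swerve, and you win:
assert opponent_best_reply("straight") == "swerve"
assert PAYOFFS[("straight", opponent_best_reply("straight"))][0] == 1
```

The whole trick is that the commitment must be visible and credible: the opponent best-responds to your restricted move set, not to what a free agent would rationally choose.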
We aren’t transparent. The only reason to fulfill our threats is so that people will later know that we will, in which case it’s totally rational by any decision theory.
These “pills” and “dispositions” are equivalent to pre-commitments. If you’re interested in the math and some interesting examples, I’d suggest reading The Strategy of Conflict.
Yep (I actually discuss the case of emotions in the linked post!)