Consistency effects seem to me to be the sort of error that more intelligent people might be MORE prone to, and are thus, it seems to me, particularly important to flag.
BTW, 3E seems to me to be by far the most important of the rationality suggestions given, largely because it actually seems practical.
The reference to your quote on OB and here about finding comfort in lying is confusing to me. Do you have a better explanation of what you mean? I am not understanding it at all.
Andrew, suppose you’re under strong social pressure to say X. The very thought of failing to affirm X, or of saying not-X, makes your stomach sink in fear—you know how your allies would respond to that, and it’d be terrible. You’d be friendless and ostracized. Or you’d hurt the feelings of someone close to you. Or you’d be unable to get people to help you with this thing you really want to do. Or whatever.
Under those circumstances, if you’re careful to always meticulously tell “your sincere beliefs”, you may well find yourself rationalizing, and telling yourself you “sincerely believe” X, quite apart from the evidence for X. Instead of telling lie X to others only, you’ll be liable to tell lie X to others and to yourself at once. You’ll guard your mind against “thoughtcrime”.
OTOH, if you leave yourself the possibility of saying X while believing Y, silently, inside your own mind, you’ll be more able to think through X’s truth or falsity honestly, because, when “what if X were true?” flashes through the corner of your mind, “I can’t think that; everyone will hate me” won’t follow as closely.
Personally I enjoy heresies too much to be worried about being biased against them, but that still leaves the problem of others responding negatively. I’m sure there are situations where lies are the least bad solution, but where possible I’d rather get good at avoiding questions, answering ambiguously, or answering with some related opinion that you actually hold and that you know they will like, and so on. In addition to the point about eroding the moral authority of the rationalist community, I’m somewhat worried about majority-pleasing lies amplifying groupthink (emperor’s new clothes, etc.), liars being socially punished and ever after distrusted if found out, and false speech seeping into false beliefs through exactly the sort of mechanism described here (will we be tempted to believe consistently with past speech? does skill at lying to others translate into skill at lying to oneself? do there exist basic disgust responses against untruth that make rationality easier and that are eroded by emotional comfort with lying?). If nothing else, I’d advise being clearly aware of it whenever you’re in “black hat mode” (for example, imagine quote marks around key words).