Or what thoughts do you have regarding Michael Vassar’s suggestion to practice lying?
(Reusing an old joke)
Q: What’s the difference between a creationist preacher and a rationalist?
A: The rationalist knows when he’s lying.
I’m having trouble resolving 2a and 3b.
2a. Hyper-vigilant honesty. Take care never to say anything but what is best supported by the evidence, aloud or to yourself, lest you come to believe it.
3b. Build emotional comfort with lying, so you won’t be tempted to rationalize your last week’s false claim, or your next week’s political convenience. Perhaps follow Michael Vassar’s suggestion to lie on purpose in some unimportant contexts.
I find myself rejecting 3b as a useful practice because:
What I think will be an unimportant and undetectable lie has a nonzero probability of being detected and considered important by someone whose confidence I value. See Entangled Truths, Contagious Lies.
This post points out the dangers of self-delusion from motivated small lies, e.g., “if I hang out with a bunch of Green Sky-ers, and I make small remarks that accord with the Green Sky position so that they’ll like me, I’m liable to end up a Green Sky-er myself.” Is there any evidence to show that I’ll be safer from my own lies if I deliberately tag them at the time I tell them?
Building rationalism as a movement to improve humanity doesn’t need to be encumbered by accusations that the movement encourages dishonesty. Even though one might justify the practice of telling unimportant lies as a means to prevent a larger, more problematic bias, advocating lies at any level is begging to be quote-mined and portrayed as fundamentally immoral.
The justification for 3b (“so you won’t be tempted to rationalize your last week’s false claim, or your next week’s political convenience.”) doesn’t work for me. I don’t know if I’m different, but I find that I have far more respect for people (particularly politicians) who admit they were wrong.
Rather than practising being emotionally comfortable lying, I’d rather practise being comfortable with acknowledging fallibility.
I did preface my list with “I’m not recommending these, just putting them out there for consideration”. 2a and 3b contradict one another in the sense that one cannot fully practice both, but each is worth considering. Also, many of us could do more of both 2a and 3b than we currently do: we could be more careful to only ever tell ourselves what’s best supported by the evidence (rather than pleasant rationalizations), and to mostly only say this to others as well, while also making the option of lying more cognitively available.
Is there any evidence to show that I’ll be safer from my own lies if I deliberately tag them at the time I tell them?
There’s good evidence that people paid $20 to lie were less likely to believe their lie than people paid a mere $1 to lie. And similarly in a variety of other studies: people under strong, visible external pressure to utter particular types of speech are less likely to later believe that speech. It’s plausible, though not obvious, that people who see themselves as intentionally manipulating others, as continually making up contradictory stories, etc. will also be less likely to take their own words as true.
Building rationalism as a movement to improve humanity doesn’t need to be encumbered by accusations that the movement encourages dishonesty.
I agree this is a potential concern.
Rather than practising being emotionally comfortable lying, I’d rather practise being comfortable with acknowledging fallibility.
Vassar’s suggestion isn’t designed to help one avoid noticing one’s own past mistakes. That one really wouldn’t work for a rationalist. It’s designed to let you seriously consider ideas that others may disapprove of, while continuing to function in ordinary social environments, i.e. social environments that may demand lip service to said ideas. See my comment here.