I did preface my list with “I’m not recommending these, just putting them out there for consideration”. 2a and 3b contradict one another in the sense that one cannot fully practice both; but each is worth considering. Also, many of us could do more of both 2a and 3b than we currently do: we could be more careful to only ever tell ourselves what’s best supported by the evidence (rather than pleasant rationalizations), and to mostly say only this to others as well, while also making the option of lying more cognitively available.
Is there any evidence to show that I’ll be safer from my own lies if I deliberately tag them at the time I tell them?
There’s good evidence that people paid $20 to lie were less likely to believe their lie than people paid a mere $1 to lie. The same pattern appears in a variety of other studies: people under strong, visible external pressure to utter particular types of speech are less likely to later believe that speech. It’s plausible, though not obvious, that people who see themselves as intentionally manipulating others, as continually making up contradictory stories, etc., will likewise be less likely to take their own words as true.
Building rationalism as a movement to improve humanity doesn’t need to be encumbered by accusations that the movement encourages dishonesty.
I agree this is a potential concern.
Rather than practising being emotionally comfortable lying, I’d rather practise being comfortable with acknowledging fallibility.
Vassar’s suggestion isn’t designed to help one avoid noticing one’s own past mistakes. That one really wouldn’t work for a rationalist. It’s designed to let you seriously consider ideas that others may disapprove of, while continuing to function in ordinary social environments, i.e. social environments that may demand lip service to said ideas. See my comment here.