I worry that Radical Honesty would selectively disadvantage rationalists in human relationships. Broadcasting your opinions is much easier when you can deceive yourself about anything you’d feel uncomfortable saying to others. I wonder whether practitioners of Radical Honesty tend to become more adept at self-deception, as they stop being able to tell white lies or admit private thoughts to themselves. I have taken a less restrictive kind of honesty upon myself—to avoid statements that are literally false—and I know that this becomes more and more difficult, more and more of a disadvantage, as I deceive myself less and less.
I think that if there is ever a vow of honesty among rationalists, it will be restricted in scope. Normally, perhaps, you would avoid making statements that were literally false, and be ready to accept brutal honesty from anyone who first said “Crocker’s Rules”. Maybe you would be Radically Honest, but only with others who had taken a vow of Radical Honesty, and who understood the trust required to tell someone the truth.
There is probably a risk not of conscious insincerity, but simply of our “elephants” betraying us. Whenever you honestly say an unpopular thing, you feel the social punishment, and you are conditioned against the whole causal chain that led to that event. Some parts of the chain, such as the decision to speak openly, you consciously refuse to change. So your “elephant” may change some earlier part instead, such as believing something that goes against the group consensus, or noticing that your beliefs differ and that you should comment on them.
To prevent this, we would also need a policy of not punishing people socially for saying unpopular things. But if we made this a general rule, we could die by pacifism. So… perhaps we should do this only in specific situations, with people already filtered by some other process?