Yes, there is sometimes a trade-off between truth and optimal signaling; and in those cases, if you aren't willing to lie, or aren't good at lying, rationality makes your signaling worse. But not always: there are far more cases where it's just a matter of recognizing what you're signaling and how, and fixing any incorrect signals. In those cases, rationality makes your signaling better. I believe the effects from the second case are usually stronger, so that becoming more rational represents a net gain, albeit a smaller gain than if everyone loved interacting with honest truth-seekers.
Well, one of the questions in the OP is precisely about fixing the signals. How exactly does one go about that? I'm basically asking whether this application of rationality can or should be discussed on LW in a way similar to, let's say, the discussion of akrasia.