A person who has a more realistic self-image than average might appear less nice than an average person who is equally nice. Thus, the choice to improve your epistemic rationality also causes you to implicitly lie to the people you interact with, presenting yourself as a less nice person than you actually are.
I understand your first sentence, and agree ceteris paribus (but I think the person with the realistic beliefs is in a better position to become actually nicer). Your second makes no sense to me. How is it implicitly lying to have accurate beliefs about how nice you are? The other way around seems more plausible.
The improved accuracy is a property of your own beliefs about yourself, not of other people’s beliefs about you. By increasing the accuracy of your beliefs about yourself, you simultaneously decrease the accuracy of other people’s beliefs about you (unless you compensate with additional signalling by other means, which may be impossible in many cases). Consciously compromising the accuracy of other people’s beliefs is usually called lying, or at least “not technically lying”.
I think that may be the most roundabout and head-spinny justification for self-deception I’ve ever heard. Wow. By a similar token, should I not take up gardening if it’s not within my power to update everyone who has the belief that I don’t garden?
I think that may be the most roundabout and head-spinny justification for self-deception I’ve ever heard.
Note that I don’t endorse self-deception; see my other comment in this thread. But the argument points to a negative trait of the choice. (The argument is related to the stance that, as a rationalist, you’d want to use rhetoric as much as is common, but not more, to avoid falsely signalling that your position is weak.)
By a similar token, should I not take up gardening if it’s not within my power to update everyone who has the belief that I don’t garden?
Normally, if you take up gardening, other people’s beliefs either stay unchanged (prior state of knowledge: they have no new evidence) or move up, towards the truth, upon receiving new evidence. Here the situation is reversed: new evidence (not a new action; this is where your analogy breaks) moves people’s beliefs away from the truth.
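To make the disputed mechanism concrete, here is a minimal numeric sketch. This is my own toy model, not something from the thread; the inflation figure and the simple discounting rule are assumptions. The idea: if observers discount everyone’s self-presentation by the average person’s inflation, then someone who presents an accurate self-image ends up being read as less nice than they actually are.

```python
# Toy model (illustrative assumptions only): observers interpret self-presentation
# through a prior calibrated to the average person's inflated self-image, so an
# accurate self-presenter is systematically underestimated.

AVERAGE_INFLATION = 2.0  # assumed average gap between presented and true niceness


def observer_estimate(presented_niceness: float) -> float:
    """Observer discounts any presentation by the average inflation."""
    return presented_niceness - AVERAGE_INFLATION


true_niceness = 7.0

# Average person: inflated self-image, presents truth + inflation.
average_presentation = true_niceness + AVERAGE_INFLATION
print(observer_estimate(average_presentation))  # 7.0 -- observer lands on the truth

# Realistic person: accurate self-image, presents the truth as-is.
realistic_presentation = true_niceness
print(observer_estimate(realistic_presentation))  # 5.0 -- observer's belief moves away from the truth
```

On these assumptions, the realistic person’s improved self-knowledge is exactly what pushes the observer’s estimate below the truth, which is the effect being argued over above.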