The comment shouldn’t make you feel like a weirdo. I made the comment, and I fully agree with your opinions. I don’t think I have ever feigned beliefs for approval, and in the rare cases where I keep quiet and don’t express opposing beliefs, I feel horrible. However I do think this is a huge issue for the rest of the world, and Less Wrong has a lot of opinions of the form “X is obvious and if you don’t believe X you are crazy”, which makes me worry that this might be an issue for some here. Also, your comment mostly just says 3 is bad. I think the more interesting question is probably 1 or 2.
Feigning beliefs for approval is something I suppose I’ve engaged in a time or two, and to the best of my recollection, I quickly felt the need for a shower.
I agree—there is rarely conscious feigning of belief in pursuit of benefits. People who do this are actually “evil” mutants. A neurotypical human would feel guilty; even neurotypical humans who don’t actually care about truth feel guilty when they lie.
in the rare cases where I keep quiet and don’t express opposing beliefs, I feel horrible. However I do think this is a huge issue for the rest of the world
The rest of the world are not evil mutants—see here. They behave as if they were, but they are running on much the same cognitive machinery as you. So here are some alternative explanations for the behavior:
1) People often withhold beliefs for fear of social repercussions. Lies by omission are less guilt-inducing.
2) People are often sincerely convinced (or sincerely uncertain about something they would otherwise be certain about) for the wrong reason—because their social group believes it. This is most common for beliefs that have no practical consequences, or beliefs whose practical consequences are non-obvious.
3) People often hold belief-in-belief (thinking that they believe in accordance with their social group, but then not acting that way and failing to see the contradiction). This is basically self-deception: there is an instinctive imperative not to lie, whether or not we make it an explicit norm.
It’s hard to actually notice when you are doing these things.
But I think you’re still correct that changing the social incentive structures would change the behavior.
I disagree. Oh, they’re not so evil, but they may as well be a different species.
You know how Haidt describes different moral modalities, with different people consistently putting different weights on those modalities? I think the same thing occurs with truth modalities. For some people, the truth is whatever is socially useful. To such a creature, values that I hold dear, such as honesty and knowledge, are simply alien and irrelevant. To me, they are Clippy, looking to turn me into a paperclip. And the bad news for me is that on this planet, I’m the weirdo alien from another planet, not them.
I’m saying that people do value honesty, but can’t pursue it as a value effectively because of faulty cognitive machinery, poor epistemic skills, and a dislike of admitting (even to themselves) that they are wrong. I think that the Lesswrong / rationalist / skeptic community tends to be composed of folks with superior cognitive machinery and epistemic skills in this dimension. When people say “I value honesty”, they believe that they are speaking honestly, even if they aren’t entirely sure what truth means.
As I see it, you’re saying that people do not value honesty and purposefully choose to ignore it in favor of other, more instrumental values. And you extend this trait to the Lesswrong / rationalist / skeptic community as well. When people say “I value honesty”, in their mind they know it to be a lie but do not care. If they were ever to say “I consider truth to be whatever is socially useful”, in their mind they would believe that this is an honest statement.
Both our hypotheses explain the same phenomenon. My mental disagreement flowchart says that it is time to ask the following questions:
0) Did I state your point and derive its logical implications correctly? Do you find my point coherent, even if it’s wrong?
1) Do you have evidence (anecdotal or otherwise) which favors your hypothesis above mine?
(My evidence is that neurotypical humans experience guilt when being dishonest, and this makes being dishonest difficult. Do you dispute the truth of this evidence? Alternatively, do you dispute that this evidence increases the likelihood of my hypothesis?)
2) Do you stake a claim to parsimony? I do, since my hypothesis relies entirely on what we already know about biases and variations in the ability to think logically.
1) People often withhold beliefs for fear of social repercussions. Lies by omission are less guilt-inducing.
This is exactly the phenomenon that I was trying to say is a big problem for a lot of people. I do not think that the rest of the world directly lies that much, but I do think they lie by omission a lot because of social pressure.
When I talk about dishonest signalling with beliefs, I am including both lies by omission, as in 1), and subconscious lies, as in 3). 2), however, is an entirely different issue.
Less Wrong has a lot of opinions of the form “X is obvious and if you don’t believe X you are crazy”
This strikes me as a problem of presentation more than anything else. I’ve had computer science professors whose lecture style contained a lot of “X is obvious and if you don’t believe X you are crazy”—which was extremely jarring at first, as I came into a CS graduate program from a non-CS background, and didn’t have a lot of the formal/academic background that my classmates did. Once I brought myself up to speed, I had the methods I needed to evaluate the obviousness and validity of various Xs, but until then, I sure didn’t open my mouth in class a lot.
In the classes I TAed, I strove for a lecture style of “X is the case and if you don’t understand why then it’s my job to help you connect the dots.” That was for 101-level classes, which LW, at least to my impression, is not; if the Sequences are the curriculum for undergraduate rationality, LW is kind of like the grad student lounge. But not, like, a snooty exclusive one; anyone’s welcome to hang out and contribute. Still, the focus is on contributing, so there’s at least perceived social pressure to perform up to a certain standard—for me it’s experientially very similar to the grad school experience I described.
We’re talking about opinions here rather than theorems, and there’s a distinction I want to draw between opinions that are speculations about something and opinions that are … personal? … but I’m having trouble articulating it; I will try again later, but wanted to point out that this experience you describe generalizes beyond LW.