...though it is also worth noting that humans have evolved to be reasonable lie-detectors.
If your actual beliefs don’t match your signalled beliefs, others may pick up on that, expose you as a liar, and punish you.
You can choose to think of signaling beliefs as lying, but that’s not very helpful to anyone. It’s what most people do naturally and therefore not a violation of anyone’s expectations in most contexts. Maybe instead it should be called speaking Statusese.
People don’t pick up on the literal truth of your statements but on your own belief that you are doing something wrong. For instance, writers of fiction aren’t typically considered immoral liars.
People will agree that fiction isn’t true, but not that their professed beliefs aren’t true.
Signalling beliefs that don’t match your actual beliefs is what I said and meant.
Like claiming to be a vegan, and then eating spam.
If the whole world claims to be vegan and then eats spam, and moreover sees this as completely normal and expected, and sees people who don’t do it as weird and untrustworthy, what exactly are you accomplishing by refusing to go along with it?
Some of us have trouble keeping near and far modes separate. People like us, if we try professing veganism, will find ourselves actually not eating spam.
My personal solution is to lie; I’m actually quite good at it!
What does that have to do with the topic? That was just an example of signalling beliefs that don’t match your actual beliefs.
One could as easily say that it isn’t useful to consider lying from the viewpoint of morality.
And ideally, you’d take that fact into account in forming your actual beliefs. I think it’s pretty well-established here that having accurate beliefs shouldn’t actually hurt you. It’s not a good strategy to change your actual beliefs so that you can signal more effectively—and it probably wouldn’t work, anyway.
Hmm: Information Hazards: A Typology of Potential Harms from Knowledge …?
I haven’t read that paper (thanks for the link, I’ll definitely do so), but that seems to be a separate issue from choosing which beliefs to hold based on what they will do for your social status. Still, I would argue that limiting knowledge is preferable only in select cases, not a good general rule to abide by, partial knowledge of biases and such notwithstanding.
Not at all. It is well established that having accurate beliefs should not hurt a perfect Bayesian intelligence. Believing it applies to mere humans would be naive in the extreme.
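For what it’s worth, the standard argument behind the “accurate beliefs can’t hurt” claim is the non-negativity of the expected value of information for an ideal expected-utility maximizer. A minimal sketch, in my own notation rather than anything from this thread, which also makes the “perfect Bayesian” caveat explicit:

$$
\mathbb{E}_{X}\!\left[\,\max_{a}\ \mathbb{E}\!\left[U(a,\theta)\mid X\right]\right]
\;\ge\;
\max_{a}\ \mathbb{E}_{X}\!\left[\,\mathbb{E}\!\left[U(a,\theta)\mid X\right]\right]
\;=\;
\max_{a}\ \mathbb{E}\!\left[U(a,\theta)\right]
$$

The left-hand side is the expected utility of an agent that updates on the evidence $X$ and then takes the best action under its posterior; the right-hand side is the expected utility of ignoring $X$ and acting on the prior. The inequality holds because an expectation of maxima is at least the maximum of expectations. Note that the guarantee assumes the agent’s behaviour depends on its beliefs only through that argmax, so it says nothing about agents whose beliefs leak into behaviour in other ways (near/far slippage, involuntary tells), which is exactly the case being disputed here.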
The fact that we are so damn good at changing our actual beliefs to signal more effectively is evidence to the contrary!
I’m not understanding the disagreement here. I’ll grant that imperfect knowledge can be harmful, but is anybody really going to argue that it isn’t useful to try to have the most accurate map of the territory?
We are talking about signalling. So for most people, yes.