I think it’s pretty well-established here that having accurate beliefs shouldn’t actually hurt you.
Not at all. It is well established that having accurate beliefs should not hurt a perfect Bayesian intelligence. Believing that it applies to mere humans would be naive in the extreme.
It’s not a good strategy to change your actual beliefs so that you can signal more effectively—and it probably wouldn’t work, anyway.
The fact that we are so damn good at it is evidence to the contrary!
I’m not understanding the disagreement here. I’ll grant that imperfect knowledge can be harmful, but is anybody really going to argue that it isn’t useful to try to have the most accurate map of the territory?
We are talking about signalling, so for most people, yes.