I disagree… I think “limited analysis resources” accounts for the very difference you speak of. The “rituals of cognition” you mention are themselves subject to rationality analysis: if I’m understanding you correctly, you are talking about someone who knows how to be rational in theory but cannot implement that theory in practice. I think you run into three possibilities there.
One, the person has insufficient analytical resources to translate their theory into action, which Robin accounts for. The person is still rational, given their budget constraint.
Two, the person could gain the ability to make the proper translation, but the costs of doing so are so high that the person is better off with the occasional translation error. The person rationally chooses not to learn better translation techniques.
Three, the person systematically makes mistakes in the translations. That, I think, we can fairly call a bias, which is what we’re trying to avoid here. The person is acting irrationally—if there is a predictable bias, it should have been corrected for.
On your last point: “[Robin would] give someone “rationality points” for coming up with a better algorithm that requires less clock cycles, while I would just give them “cleverness points”.” I think I have to side with Robin here. On certain issues it might not matter how quickly or efficiently the rational result is arrived at, but in almost all situations coming up with a faster way to reach a rational result is itself more rational, since individuals face constraints of time and resources. While the faster algorithm isn’t more rational on a single, isolated issue [assuming both lead to the same rational result], the person using it can move on to the next issue sooner and thus has more resources available to be rational in other settings.
We’re forgetting signaling. Robin would never forgive us, because he sees it in a lot of things, and I happen to agree with him that it’s far more pervasive than people think.
In fact, the Tversky example gives people two opportunities to signal: not only do they get to demonstrate higher pain tolerance [especially important for men], they also get to “demonstrate” a healthier heart. Both should boost status.
The same goes for Calvinists: though you truly believe in the elect when you stop to think about it, you don’t think about it for most of your life [as we know, much of our day-to-day life is subconsciously guided] and are instead focused on signaling your elect status by living a good life.
For good measure, it even works with the car: you buy a new car to signal wealth, which in turn signals health.
However, I do believe that we engage in lots of automatic self-deception [making it easier to deceive others into believing we have higher status]: thus, we may actually believe that an extra car/a good life/a higher pain tolerance would improve our life expectancy/grace/heart, but that’s merely the proximate cause. Ultimately, we’re driven by status-seeking.