I think I observe this a lot in general: "as soon as those implications do not personally benefit them", and even more so when following the implication comes with a cost or conflict of interest.
On rationality in decision-making (as opposed to the truth-seeking part of belief formation, I guess): I thought it was more about being consistent with one's own preferences and values (if we constrain ourselves to the LessWrong/Sequences-ish definition)? I have a hot take:
If the action space for committing to a belief is binary, then when people do not commit to a belief, they hold it less strongly than those who do. And if we force a binary classification, a belief someone does not commit to is not really a true belief.
Alternatively, the actions following from a belief could lie on a spectrum; in this example, people could eat less meat, matching their degree of belief in "eating meat is not moral".