Got it. Two more questions:

10. Overwhelming correlation between views about other people’s beliefs and their intellectual honesty (bad)
1) What does #10 mean? (Whose intellectual honesty is being considered—the expert you’re evaluating, or other people whose beliefs the expert has views about?) Can you give an example of this one?
12. Prescribing malicious intent to groups of people that haven’t been selected through a plausible filter (bad)
2) For #12, what do you mean about the filter—do you mean: A) “Ascribing malicious intent to people, even though those people haven’t been selected for maliciousness”, or B) “Ascribing malicious intent to people because those people haven’t been selected for non-maliciousness”?
(Also, I think you mean ascribing rather than prescribing for both #9 and #12.)
Yes, I do. Non-native speaker. Clearly there are more issues with clarity than I had thought, so thank you for this comment.
@12: I mean A). It seems clear to me that very few people are actually malicious, and not being aware of that on some level is a signal of incompetence and low trustworthiness. An example here is believing that anyone who wants stronger borders must dislike certain cultures.
@10: other people the expert has views about. This is about doubting the sincerity of people who disagree with you on emotionally charged topics. Say X and Y are public figures who disagree on the minimum wage, and say person A declares X, who happens to agree with her position, to be the more honest one. If A does this too often, that makes me update downward on her trustworthiness.
I have the word “overwhelming” in there because I think there can be a real correlation between holding one position on a question and being more honest, so if A does this only occasionally, that can be fine. But if there are no exceptions at all, if people on A’s side are consistently the good guys, that’s a bad signal.
“Intellectual honesty” is a bad term here because it’s too narrow. I’ll look for something better.
Thanks for the clarification, makes sense to me now!