Why should evidence of bias be some evidence against a belief? This would be like magic: using someone’s failure of rationality to learn something about the world, which is absurd. (Example: Federer’s wife is very confident that he will win, because she is biased in his favor. Does this give me any reason to bet against Federer? Obviously not.)
If you find out that someone believes A then that’s evidence for A, so your beliefs change away from the priors. If you subsequently find that the person is likely biased then your beliefs return some way toward your priors. So finding out about the bias was in some sense evidence about A.
To be precise, knowing that someone is biased towards holding a belief decreases the amount you should update your own beliefs in response to theirs — because it decreases the likelihood ratio of the test.
(That is, a bias towards a belief means the person is more likely to hold it even when it isn’t true (more false positives, i.e. a higher P(B|¬A)), so a bias-influenced belief is weaker evidence. In Bayesian terms, where A is the proposition and B is “the person believes A”: bias increases P(B) without increasing P(B|A), so by Bayes’ theorem P(A|B) = P(B|A)P(A)/P(B) decreases.)
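Plugging illustrative numbers into Bayes’ theorem makes this concrete. In this sketch (the specific probabilities are hypothetical, not from the post), A is the proposition and B is “the person believes A”; the biased observer differs only in having a higher false-positive rate P(B|¬A):

```python
# Toy model: treating "someone believes A" as a noisy test for A.
# All numbers below are illustrative assumptions.

def posterior(prior_a, p_b_given_a, p_b_given_not_a):
    """P(A | B) by Bayes' theorem, where B = 'the person believes A'."""
    p_b = p_b_given_a * prior_a + p_b_given_not_a * (1 - prior_a)
    return p_b_given_a * prior_a / p_b

prior = 0.5
# Unbiased observer: believes A when true 90% of the time, when false 20%.
unbiased = posterior(prior, 0.9, 0.2)
# Biased observer: same hit rate, but far more false positives (70%).
biased = posterior(prior, 0.9, 0.7)

print(unbiased)  # higher posterior: belief is strong evidence
print(biased)    # lower posterior: belief is weak evidence, but still > prior
```

Note that the biased observer’s belief still moves the posterior above the prior; the bias weakens the evidence rather than reversing it, which is exactly why it gives no reason to bet against Federer.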
So CarmendeMacedo’s right that a person’s biases alone tell you nothing about the world, but you should still decrease your confidence if you discover a bias, because it means you used too high a likelihood ratio when you updated the first time.