To be precise, knowing that someone is biased towards holding a belief decreases the amount you should update your own beliefs in response to theirs — because it decreases the likelihood ratio of the test.
(That is, having a bias towards a belief means people are more likely to believe it even when it isn’t true (more false positives), so a bias-influenced belief is less likely to indicate truth and is therefore weaker evidence. In Bayesian terms, with A = "the belief is true" and B = "the person holds the belief", bias increases P(B) without increasing P(B|A), so it decreases P(A|B).)
So CarmendeMacedo’s right that you can’t get evidence about the world from knowledge of a person’s biases alone, but you should decrease your confidence if you discover a bias after the fact, because it means you overestimated the strength of their belief as evidence when you updated the first time.
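To make that concrete, here's a minimal numerical sketch (the prior, the hit rate, and both false-positive rates are made-up numbers chosen only to illustrate the effect):

```python
# A = "the claim is true", B = "the person believes the claim".

def posterior(prior, p_believe_given_true, p_believe_given_false):
    """P(A|B) via Bayes' rule."""
    p_b = p_believe_given_true * prior + p_believe_given_false * (1 - prior)
    return p_believe_given_true * prior / p_b

prior = 0.5    # your prior that the claim is true
p_true = 0.9   # P(B|A): how likely they'd believe it if it were true

# Unbiased person: few false positives, so their belief is strong evidence.
print(posterior(prior, p_true, 0.2))   # ~0.82

# Biased person: many false positives, so the likelihood ratio shrinks
# and the same belief should move you much less.
print(posterior(prior, p_true, 0.6))   # ~0.60
```

If you had originally updated to ~0.82 and then discovered the bias, the right move is to re-run the update with the weaker likelihood ratio, which is exactly the "decrease your confidence" step above.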