Yes, this is an interesting issue. One unusual perspective (at least, I have not seen anyone advocate it seriously elsewhere) is the one mentioned by Tyler Cowen here. The gist is that, in Bayesian terms, the fact that someone thought an issue was important enough to lie about is evidence that their claim is correct.
Hmmm. It’s better evidence that they want you to believe the claim is correct.
For example, I might cherry-pick evidence to suggest that anyone who gives me $1 is significantly less likely to be killed by a crocodile. I don’t believe that myself, but it is to my advantage that you believe it, because then I am likely to get $1.
The Bayesian point only stands if P(ClimateGate | AGW) > P(ClimateGate | ~AGW). That is the only way you can revise your prior upwards in light of ClimateGate.
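That condition is just the likelihood-ratio form of Bayes' rule. A minimal sketch (with purely illustrative probabilities, not real estimates of anything climate-related) showing that the posterior rises above the prior exactly when the likelihood ratio exceeds 1:

```python
def posterior(prior, p_e_given_h, p_e_given_not_h):
    """P(H | E) via Bayes' rule: P(E|H)P(H) / P(E)."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

prior = 0.5  # illustrative prior P(AGW)

# Likelihood ratio > 1: observing the evidence raises the prior.
assert posterior(prior, 0.3, 0.1) > prior   # 0.75

# Likelihood ratio < 1: observing the evidence lowers it.
assert posterior(prior, 0.1, 0.3) < prior   # 0.25

# Likelihood ratio = 1: the evidence is uninformative.
assert posterior(prior, 0.2, 0.2) == prior
```

So the whole argument turns on whether a scandal like ClimateGate is really more probable in the world where the underlying claim is true than in the world where it is false.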
Or their position on the issue could be motivated by some other issue you don’t even know is on their agenda.
Or...pretty much anything.