(I assume that by “Bayes factor” you mean the net effect on the posterior probability.)
I’m using the standard meaning: for a hypothesis H and evidence E, the Bayes factor is p(E|H)/p(E|~H). It’s easiest to think of it as the factor you multiply your prior odds by to get posterior odds. (Odds, not probabilities.) Which means I goofed and said “positive” when I meant “above unity” :-/
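To make that concrete with made-up numbers: if your prior odds on H are 1:4 (p(H) = 0.2) and the Bayes factor is 8, your posterior odds are 8 × 1:4 = 2:1, i.e. p(H|E) = 2/3. You multiply the odds, not the probability 0.2.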
I read Tyler as not knowing what he’s talking about. For one thing, do you notice how he’s trying to justify why something should have p>0 under a Bayesian analysis … when Bayesian inference already requires p’s to be greater than zero?
In his original post, he was explaining a scenario under which seeing fraud should make you raise your p(AGW). Though he’s not thinking clearly enough to say it, this is equivalent to describing a scenario under which the Bayes factor is greater than unity. (I admit I probably shouldn’t have said “argument for >1 Bayes factor”, but rather, “suggestion of plausibility of >1 Bayes factor”.)
That’s the charitable interpretation of what he said. If he didn’t mean that, as you seem to think, then he’s presenting metrics that aren’t helpful, and this is clear when he thinks it’s some profound insight to put p(fraud due to importance of issue) greater than zero. Yes, there are cases where AGW is true despite this evidence, but what’s the impact on the Bayes factor?
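To illustrate with numbers I’m inventing purely for the sake of the point: if p(fraud|AGW) = 0.01 and p(fraud|~AGW) = 0.05, then p(fraud|AGW) is comfortably above zero, yet the Bayes factor is 0.01/0.05 = 0.2, so seeing fraud should shrink your odds on AGW by a factor of five. A probability being positive tells you nothing about which way the ratio points.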
Why should we care about arbitrarily small probabilities?
Tyler was not misunderstood: he used probability and Bayesian inference incorrectly and vacuously, then tried to backpedal. (My comment on page 2.)
Anyway, I think we agree on the substance:
The fact that the p Tyler referred to is greater than zero is insufficient information to know how to update.
The scenario Tyler described is insufficient to give Climategate a Bayes factor above 1.
(I was going to drop the issue, but you seem serious about de-Aumanning this, so I gave a full reply.)
I think we are arguing past each other, but since it’s about interpreting someone else, I’m not that worried about it. I’ll add one more bullet to your list to clarify what I think Tyler is saying. If that doesn’t resolve it, oh well.
If we know with certainty that the scenario Tyler described is true, that is, if we know that the scientists fudged things because they knew that AGW was real and that the consequences were worth risking their reputations on, then Climategate has a Bayes factor above 1.
I don’t think Tyler was saying anything more than that. (Well, and that p(his scenario) is non-negligible.)
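To spell out why the conditioning matters (my notation, not anything from the thread): write S for Tyler’s scenario and E for Climategate. Then p(E|AGW) = p(E|AGW,S) p(S|AGW) + p(E|AGW,~S) p(~S|AGW), so even if the scenario-conditional ratio p(E|AGW,S)/p(E|~AGW) is above 1, the overall Bayes factor p(E|AGW)/p(E|~AGW) can still come out below 1 when p(S|AGW) is small. That’s how your two bullets and this one can all be true at once.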