Log odds of independent events do not add up, just as the odds of independent events do not multiply. The odds of flipping heads are 1:1, but the odds of flipping heads twice are 1:3, not 1:1 (you have to multiply odds by likelihood ratios, not odds by odds, and likewise you don’t add log odds and log odds, but log odds and log likelihood-ratios). So calling log odds themselves “evidence” doesn’t fit the way people use the word “evidence” as something that “adds up”.
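To make the point concrete, here is a quick sketch (my own toy numbers, not from the thread) showing that odds don’t multiply, but that odds do update by likelihood ratios, which is addition in log space:

```python
import math

# odds of an event = P(event) / P(not event)
def odds(p):
    return p / (1 - p)

odds_one_head = odds(0.5)    # 1:1, i.e. 1.0
odds_two_heads = odds(0.25)  # 1:3, i.e. about 0.333

# Odds do NOT multiply: 1.0 * 1.0 is nowhere near 1:3.
assert abs(odds_one_head * odds_one_head - odds_two_heads) > 0.5

# What DOES work: posterior odds = prior odds * likelihood ratio.
# Toy hypotheses: H1 = "coin is biased, P(heads) = 0.75" vs H2 = "coin is fair".
prior_odds = 1.0                        # even prior odds on H1 vs H2
lr_heads = 0.75 / 0.5                   # likelihood ratio of one observed head
posterior_odds = prior_odds * lr_heads  # 1.5

# In log space: log posterior odds = log prior odds + log likelihood ratio.
assert math.isclose(math.log(posterior_odds),
                    math.log(prior_odds) + math.log(lr_heads))
```

So the additive quantity is the log likelihood ratio, not the log odds themselves.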
This terminology may have originated here:
http://causalityrelay.wordpress.com/2008/06/23/odds-and-intuitive-bayes/
I’m voting your comment up, because I think it’s a great example of how terminology should be chosen and used carefully. If you decide to edit it, I think it would be most helpful if you left your original words as a warning to others :)
By “evidence”, I refer to events that change an agent’s strength of belief in a theory, and the measure of evidence is the measure of this change in belief, that is, the likelihood-ratio and log likelihood-ratio you refer to.
I never meant for “evidence” to refer to the posterior strength of belief. “Log odds” was only meant to specify a particular measurement of strength in belief.
Can you be clearer? Log likelihood ratios do add up, so long as the independence criterion is satisfied (i.e., so long as P(E_2|H_x) = P(E_2|E_1,H_x) for each H_x).
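A minimal check of that additivity, with toy numbers of my own: two coin flips that are conditionally independent given each of two hypotheses, so the criterion above holds and the joint likelihoods factorise:

```python
import math

# Two hypotheses about a coin (my own illustrative numbers):
p_heads = {"H1": 0.75,  # H1: coin is heads-biased
           "H2": 0.5}   # H2: coin is fair

# Log likelihood ratio (H1 vs H2) of a single observed head:
llr_one = math.log(p_heads["H1"] / p_heads["H2"])

# Joint likelihood of two heads factorises under each hypothesis,
# because P(E_2|H_x) = P(E_2|E_1,H_x) holds for these flips:
llr_both = math.log((p_heads["H1"] ** 2) / (p_heads["H2"] ** 2))

# So the log likelihood ratios of the two flips add up:
assert math.isclose(llr_both, llr_one + llr_one)
```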
Sure, just edited in the clarification: “you have to multiply odds by likelihood ratios, not odds by odds, and likewise you don’t add log odds and log odds, but log odds and log likelihood-ratios”.
As long as there are only two H_x, mind you. They no longer add up when you have three hypotheses or more.
Indeed—though I find it very hard to hang on to my intuitive grasp of this!
Here is the post on information theory I said I would write:
http://lesswrong.com/lw/1y9/information_theory_and_the_symmetry_of_updating/
It explains “mutual information”, i.e. “informational evidence”, which can be added up over as many independent events as you like. Hopefully this will have restorative effects for your intuition!
Don’t worry, I have an information theory post coming up that will fix all of this :)