I’m guessing that the rule P(A & B) < P(A) is for independent variables (though it’s more accurate to say P(A & B) ≤ P(A)). If you have dependent variables, then you use Bayes’ theorem to update. P(A & B) is different from P(A | B): P(A & B) ≤ P(A) is always true, but there’s no such guarantee for P(A | B) vs. P(A).
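To make the distinction concrete, here’s a quick Python sketch with made-up numbers for a joint distribution over A and B. It shows P(A & B) ≤ P(A) holding while P(A | B) comes out larger than P(A); the specific probabilities are just illustrative.

```python
# Toy joint distribution over two binary events A and B (made-up numbers,
# chosen only to illustrate the inequalities discussed above).
p_a_and_b = 0.30        # P(A & B)
p_a_and_not_b = 0.10    # P(A & ~B)
p_not_a_and_b = 0.20    # P(~A & B)
p_not_a_and_not_b = 0.40  # P(~A & ~B)

p_a = p_a_and_b + p_a_and_not_b   # P(A) = 0.40
p_b = p_a_and_b + p_not_a_and_b   # P(B) = 0.50
p_a_given_b = p_a_and_b / p_b     # P(A | B) = 0.60

# The conjunction can never exceed the single event:
assert p_a_and_b <= p_a           # 0.30 <= 0.40, always holds

# But the conditional probability can go either way relative to P(A):
print(p_a_given_b > p_a)          # True here: 0.60 > 0.40
```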
This is probably an incomplete or inadequate explanation, though. I think there was a thread about this a long time ago, but I can’t find it. My Google-fu is not that strong.