Being able to read off patterns of conditional dependence and independence is an art known as “D-separation”, and if you’re good at it you can glance at a diagram like this...
In order to get better at this, I recommend downloading and playing around with UnBBayes. Here’s a brief video tutorial of the basic usage. The program is pretty buggy—for example, sometimes it randomly refuses to compile a file and then starts working after I close the file and reopen it—but that’s more of a trivial inconvenience than a major problem.

What’s great about UnBBayes is that it allows you to construct a network and then show how the probability flows around it; you can also force some variable to be true or false and see how this affects the surrounding probabilities. For example, here I’ve constructed a copy of the “Season” network from the post, filled it with conditional probabilities I made up, and asked the program to calculate the overall probabilities. (This was no tough feat—it took me maybe five minutes, most of which I spent on making up the probabilities.)

http://kajsotala.fi/Random/UnBBayesExample1.png

Let’s now run through Eliezer’s explanation:

whether the Sprinkler is on and whether it is Raining are conditionally independent of each other—if we’re told that it’s Raining we conclude nothing about whether or not the Sprinkler is on.

So, we know the season: let’s say that we know it’s fall. I tell the program to assume that it’s fall, and ask it to propagate the effects of this throughout the network. We can see how this changes the probabilities of the different variables:

http://kajsotala.fi/Random/UnBBayesExample2.png

The wording here is a little ambiguous, but Eliezer’s saying that with our knowledge of the season, the variables of “Sprinkler” and “Rain” have become independent. Finding out that it rains shouldn’t change the probability of the sprinkler being on. Let’s test this by setting it to rain and again propagating the effects:

http://kajsotala.fi/Random/UnBBayesExample3.png

And indeed, the probability of it being wet increased, but the probability of the sprinkler being on didn’t change.

But if we then further observe that the sidewalk is Slippery, then Sprinkler and Rain become conditionally dependent once more, because if the Sidewalk is Slippery then it is probably Wet and this can be explained by either the Sprinkler or the Rain but probably not both, i.e. if we’re told that it’s Raining we conclude that it’s less likely that the Sprinkler was on.

So let’s set the sidewalk to “slippery”, and unset “rain” again. I’ve defined the network so that the sidewalk is never slippery unless it’s wet, so setting the sidewalk to “slippery” forces the probability of “wet” to 100%.

http://kajsotala.fi/Random/UnBBayesExample4.png

Now let’s see the effect on the probabilities if we set it to rain—as Eliezer predicted, the probability of the sprinkler then goes down:

http://kajsotala.fi/Random/UnBBayesExample5.png

And vice versa, if we force the sprinkler to be on, the probability of it raining goes down:

http://kajsotala.fi/Random/UnBBayesExample6.png

(It occurs to me that even if the sidewalk wasn’t wet, it could be slippery if it was covered by leaves, or with ice. So there should actually be an arrow from “season” to “slippery”. That would be trivial to add.)

Another great thing about UnBBayes is that it not only helps you understand the direction of the probability flows, but also the magnitude of different kinds of changes. Depending on how you’ve set up the conditional probabilities, a piece of information can have a huge impact on another variable (the sidewalk being slippery always forces the probability of “wet” to 100%, regardless of anything else), or a rather minor one (when we already knew that it was fall and slippery, finding out that it rained only budged the probability of the sprinkler by about five percentage points). Eventually, the logic starts to become intuitive.
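If you’d rather double-check these probability flows without installing anything, the same network fits in a few lines of Python. This is just a sketch using brute-force enumeration over all possible worlds; the conditional probabilities below are made up for this example (they are not the ones from my screenshots), so the exact numbers differ from the images above:

```python
# Season -> {Sprinkler, Rain} -> Wet -> Slippery, checked by enumerating
# every possible assignment of the variables. Illustrative numbers only.
from itertools import product

SEASONS = ["spring", "summer", "fall", "winter"]

p_season = {"spring": 0.25, "summer": 0.25, "fall": 0.25, "winter": 0.25}
p_sprinkler = {"spring": 0.4, "summer": 0.7, "fall": 0.1, "winter": 0.0}
p_rain = {"spring": 0.4, "summer": 0.1, "fall": 0.5, "winter": 0.3}

def p_wet(sprinkler, rain):
    # P(Wet=True | Sprinkler, Rain): wet if either water source is active.
    if sprinkler and rain:
        return 0.99
    if sprinkler or rain:
        return 0.9
    return 0.05

def p_slippery(wet):
    # P(Slippery=True | Wet): never slippery unless wet.
    return 0.7 if wet else 0.0

def joint(season, sprinkler, rain, wet, slippery):
    def f(p, x):  # probability of a binary variable taking value x
        return p if x else 1.0 - p
    return (p_season[season]
            * f(p_sprinkler[season], sprinkler)
            * f(p_rain[season], rain)
            * f(p_wet(sprinkler, rain), wet)
            * f(p_slippery(wet), slippery))

def prob(query, **evidence):
    # P(query=True | evidence), by summing the joint over all worlds
    # consistent with the evidence.
    num = den = 0.0
    for season in SEASONS:
        for spr, rain, wet, sli in product([False, True], repeat=4):
            world = dict(season=season, sprinkler=spr, rain=rain,
                         wet=wet, slippery=sli)
            if any(world[k] != v for k, v in evidence.items()):
                continue
            p = joint(season, spr, rain, wet, sli)
            den += p
            if world[query]:
                num += p
    return num / den

# Given the season, Sprinkler and Rain are independent:
print(round(prob("sprinkler", season="fall"), 3))             # 0.1
print(round(prob("sprinkler", season="fall", rain=True), 3))  # 0.1

# But once we also observe a slippery sidewalk, rain "explains away"
# the wetness, so learning that it rained lowers P(sprinkler):
print(round(prob("sprinkler", season="fall", slippery=True), 3))             # 0.181
print(round(prob("sprinkler", season="fall", slippery=True, rain=True), 3))  # 0.109
```

Run it and you reproduce both halves of Eliezer’s explanation: the first two numbers are identical, while the last number is lower than the third.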
Build your own networks and play around with them!
Nice.
I still want somebody to write a full tutorial on UnBBayes for LW.
That’s where I originally found the program. :-)
If you link to that post, you should also update it to mention that I’ve already written some of those. (#3, #9, #10)
Updated!
There is also the more mature GeNIe http://genie.sis.pitt.edu/ with extensive documentation here: http://genie.sis.pitt.edu/wiki/GeNIe_Documentation (more functionality, and more thorough documentation)
=====================
ETA: Oops, forgot to add that the above is Windows-only. For other platforms there is SamIam http://reasoning.cs.ucla.edu/samiam/, with documentation/tutorials here: http://reasoning.cs.ucla.edu/samiam/help/