Sure. In a lot of cases where there seems to be a cyclic dependency, it’s actually an acyclic dependency when unrolled over time. So X1 causes Y1, which causes Z1, which causes X2 (the number denotes the timestep).
The simplest model of this type is one where X1 directly causes X2, which directly causes X3, and so on; this is called a Markov chain, and it is a type of Bayesian network.
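To make the unrolling concrete, here is a minimal sketch (the two-state chain and its transition probabilities are illustrative assumptions, not anything from the thread): each X_{t+1} depends only on X_t, so the “X causes X” loop becomes a straight-line DAG once you index by time.

```python
import random

# Assumed, purely illustrative transition probabilities for a
# two-state Markov chain X1 -> X2 -> X3 -> ...
P = {0: [0.9, 0.1],   # P(X_{t+1} | X_t = 0)
     1: [0.5, 0.5]}   # P(X_{t+1} | X_t = 1)

def sample_chain(x1, steps):
    """Sample the unrolled chain: each node depends only on its predecessor."""
    xs = [x1]
    for _ in range(steps - 1):
        xs.append(0 if random.random() < P[xs[-1]][0] else 1)
    return xs

print(sample_chain(x1=0, steps=10))
```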
it’s actually an acyclic dependency when unrolled over time
Yes; by introducing time steps and indexing everything by “at time tn”, you can “always” do away with loops by transforming them into spirals descending through time, from the loops they once were. But I wonder: don’t people mind that the endless spiral still resembles a loop, that “X at time t1” and “X at time t2” may technically be different nodes, but only in a trivial and uninteresting sense?
You can work DAG magic on such a “linearized” causal structure; I just wonder whether it’s actually cutting reality at its joints, and whether there are (commonly used) alternatives.
You should definitely look at the book “Probabilistic Graphical Models” by Daphne Koller and Nir Friedman; these concepts and more are explained in depth and with numerous examples. I kind of consider it the spiritual successor to Jaynes’s work.
Having a DAG is important because it makes inference much easier. There are very efficient and exact algorithms for doing inference over DAGs, whereas inference over general graphs can be extremely hard (often taking exponential time). Once you roll out a loop in this fashion, you eventually reach a point where you can assume that X1 has very little influence over, say, X100, due to mixing: http://en.wikipedia.org/wiki/Markov_chain#Ergodicity. The network can thus be ‘grounded’ at X1, breaking the cycle.
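You can see the mixing claim numerically. In this quick sketch (with an assumed 2×2 transition matrix, chosen only for illustration), the rows of P^n converge as n grows, meaning the distribution of X100 is nearly the same whichever value X1 started from:

```python
import numpy as np

# Assumed transition matrix, purely illustrative.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

for n in (1, 5, 100):
    Pn = np.linalg.matrix_power(P, n)
    print(f"n={n}:\n{Pn}")
# By n=100 both rows are essentially identical: the chain has mixed,
# so X1's value barely matters -- which is what licenses 'grounding'
# the network at X1.
```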
If you can convert a cyclic dependency to a hundred acyclic dependencies, it makes sense to do so, and not just because of computational concerns.
Often real-world events really do unroll over time and we have to have some way of modelling time dependencies. Hidden Markov models and the Viterbi algorithm are a good example of this in action.
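For a concrete feel, here is a small Viterbi sketch over a toy HMM (the states, observations, and probabilities are assumed for illustration; this is the classic rainy/sunny toy example, not anything specific from the thread). It recovers the most likely hidden state sequence, and the dynamic program works precisely because the model is unrolled over time into a DAG where each step looks back only one node:

```python
import math

states = ("Rainy", "Sunny")
start  = {"Rainy": 0.6, "Sunny": 0.4}
trans  = {"Rainy": {"Rainy": 0.7, "Sunny": 0.3},
          "Sunny": {"Rainy": 0.4, "Sunny": 0.6}}
emit   = {"Rainy": {"walk": 0.1, "shop": 0.4, "clean": 0.5},
          "Sunny": {"walk": 0.6, "shop": 0.3, "clean": 0.1}}

def viterbi(obs):
    # V[t][s] = log-prob of the best path ending in state s at time t
    V = [{s: math.log(start[s]) + math.log(emit[s][obs[0]]) for s in states}]
    back = []  # back-pointers for reconstructing the best path
    for o in obs[1:]:
        row, ptr = {}, {}
        for s in states:
            prev, lp = max(((p, V[-1][p] + math.log(trans[p][s])) for p in states),
                           key=lambda x: x[1])
            row[s] = lp + math.log(emit[s][o])
            ptr[s] = prev
        V.append(row)
        back.append(ptr)
    # Trace the best path backwards from the most likely final state.
    last = max(states, key=lambda s: V[-1][s])
    path = [last]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return list(reversed(path))

print(viterbi(["walk", "shop", "clean"]))  # e.g. ['Sunny', 'Rainy', 'Rainy']
```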
I will (… put it on my “do() at some indeterminable time in the near to far future”-list. Sigh.). Thanks.
If you can wait, she may teach it again.