In a further article I will exhibit time series for three variables, A, B, and C, where the joint distribution is multivariate normal, the correlation of A with C is below −0.99, and each has zero correlation with B. …
And in the current comment section, I’m going to give away the answer, since I’ve run through the PCT demos. (Sorry, I don’t know how to format for spoilers; will edit once I figure out how or someone tells me.)
Are you sure you didn’t want to figure it out on your own? Okay, here goes. Kennaway is describing a feedback control system: a system that observes a variable’s current value and outputs a signal that attempts to bring it back towards a reference value. A is an external disturbance. B is the deviation of the system from the reference value (the error). C is the output of the controller.
The controller C will push in the opposite direction of the disturbance A, so A and C will be almost perfectly anti-correlated. Their combined effect is to keep B very close to zero, with only random deviations, so B is uncorrelated with both.
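In case anyone wants to check this numerically, here’s a quick sketch of the kind of loop I have in mind. It’s my own toy version, not Kennaway’s actual demo: a slowly drifting disturbance plus a simple integral controller, with the gain and noise scales picked arbitrarily.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 200_000
gain = 0.5                     # integral gain of the controller (arbitrary)

# A: slowly varying external disturbance (low-pass filtered white noise)
A = np.empty(T)
a = 0.0
for t in range(T):
    a = 0.995 * a + 0.1 * rng.normal()
    A[t] = a

B = np.empty(T)                # error: controlled quantity minus reference (0)
C = np.empty(T)                # controller output
c = 0.0
for t in range(T):
    b = A[t] + c               # disturbance and control add at the input
    B[t], C[t] = b, c          # record, then let the controller react
    c -= gain * b              # integral action: lean against the error

print(np.corrcoef([A, B, C]).round(3))
# Expected pattern: corr(A, C) very close to -1, while corr(A, B) and
# corr(B, C) sit near 0.
```

Note that the controller only ever sees B, yet its output ends up mirroring A almost exactly; that is why A and C correlate so strongly while B reveals nothing.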
The disturbance and the controller jointly cause the error. So, we have A-->B and C-->B. The error also causes the controller to output what it does, so B-->C. (I assume directed cycles are allowed, since there are four possible connections and you said there are 16 possible graphs.)
Together, that’s A-->B<-->C
(In other news, Kennaway or pjeby will suggest I’m not giving due attention to Perceptual Control Theory.)
You have read my mind perfectly and understood the demos! But I’ll go ahead and make the post anyway, when I have time, because there are some general implications to draw from the disconnect between causality and correlation. Such as the impossibility of arriving at A-->B<-->C for this example from any existing algorithms for deriving causal structure from statistical information.
the impossibility of arriving at A-->B<-->C for this example from any existing algorithms for deriving causal structure from statistical information.
Correct me if I’m wrong, but I think I already know the insight behind what you’re going to say.
It’s this: there is no fully general way to detect all mutual information between variables, because that would be equivalent to being able to compute Kolmogorov complexity (the length of the shortest program that outputs a given string), which would in turn be equivalent to solving the Halting problem.
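The computability part of that claim isn’t something you can demo by running code, but the weaker point underneath it is: dependence can be completely invisible to correlation. A toy example of my own, not from anything Kennaway posted:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=100_000)
y = x ** 2                     # y is a deterministic function of x

# By symmetry the theoretical correlation is exactly 0, even though the
# mutual information is maximal: knowing x pins down y completely.
print(np.corrcoef(x, y)[0, 1])
```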
You’re wrong. :-)
Kolmogorov complexity will play no part in the exposition.
Check my comment: I was only guessing the underlying insight behind your future post, not its content.
I obviously leave room for the possibility that you’ll present a more limited or more poorly-defended version of what I just stated. ;-)