. . . Spirtes et al.’s Causation, Prediction, and Search. One of the axioms used in the last-mentioned is the Faithfulness Axiom. See the book for the precise formulation; informally put, it amounts to saying that if two variables are uncorrelated, then they are causally independent. . . . The purpose of this article is to argue that this is not the case.
Emphasis added here and below.
Causation, Prediction, and Search, page 31:
Faithfulness Condition: Let G be a causal graph and P a probability distribution generated by G. <G, P> satisfies the Faithfulness Condition if and only if every conditional independence relation true in P is entailed by the Causal Markov Condition applied to G.
Wikipedia on correlation:
If the variables are independent then the correlation is 0, but the converse is not true because the correlation coefficient detects only linear dependencies between two variables. Here is an example: Suppose the random variable X is uniformly distributed on the interval from −1 to 1, and Y = X². Then Y is completely determined by X, so that X and Y are dependent, but their correlation is zero; they are uncorrelated.
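A quick numerical check of this example (a sketch in Python, not from the article): draw X uniformly on [−1, 1], set Y = X², and compute the sample Pearson correlation.

```python
import random

def pearson(a, b):
    """Sample Pearson correlation coefficient."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b)) / n
    va = sum((x - ma) ** 2 for x in a) / n
    vb = sum((y - mb) ** 2 for y in b) / n
    return cov / (va * vb) ** 0.5

random.seed(0)
n = 100_000
xs = [random.uniform(-1.0, 1.0) for _ in range(n)]
ys = [x * x for x in xs]  # Y is completely determined by X

# Close to 0, even though X and Y are dependent: Pearson correlation
# only measures the linear part of the relationship.
print(pearson(xs, ys))
```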
Spirtes’s example on page 71 looks like a linear Gaussian causal system. In a linear Gaussian causal system, being uncorrelated is the same as being marginally independent, and it can imply complete conditional independence.
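The kind of faithfulness violation at issue can be simulated directly. This is a hedged sketch, not the actual model from page 71: a linear Gaussian system in which A affects B along two paths (a direct edge and a path through C) whose coefficients are tuned so the effects cancel exactly, leaving A and B uncorrelated even though A is a cause of B.

```python
import random

def pearson(a, b):
    """Sample Pearson correlation coefficient."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b)) / n
    va = sum((x - ma) ** 2 for x in a) / n
    vb = sum((y - mb) ** 2 for y in b) / n
    return cov / (va * vb) ** 0.5

random.seed(0)
a_coef, b_coef = 2.0, 1.5
c_coef = -a_coef * b_coef  # tuned so the two path effects cancel exactly

n = 100_000
As, Bs = [], []
for _ in range(n):
    A = random.gauss(0.0, 1.0)
    C = a_coef * A + random.gauss(0.0, 1.0)                # A -> C
    B = c_coef * A + b_coef * C + random.gauss(0.0, 1.0)   # A -> B, C -> B
    As.append(A)
    Bs.append(B)

# Near 0: the distribution is unfaithful to the graph, because the
# direct effect c and the indirect effect a*b cancel.
print(pearson(As, Bs))
```

Algebraically, B = (c + a·b)·A + b·ε_C + ε_B, and c + a·b = 0 by construction, so Cov(A, B) = 0 in the population.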
Theorem: In the long run, a bounded, differentiable real function has zero correlation with its first derivative. . . .
Notice that unlike the case that Spirtes considers, where the causal connections between two variables just happen to have multiple effects that exactly cancel, the lack of correlation between A and B is robust.
Yes, I think this is true for values of a function and its derivative sampled at single uniformly random times (for a limit sense of “uniform”).
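One concrete instance of the theorem, sampled as described above (a sketch; the choice f = sin is mine, not the article's): take f(t) = sin t, so f′(t) = cos t, both bounded, and sample both at uniformly random times over a long window.

```python
import math
import random

def pearson(a, b):
    """Sample Pearson correlation coefficient."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b)) / n
    va = sum((x - ma) ** 2 for x in a) / n
    vb = sum((y - mb) ** 2 for y in b) / n
    return cov / (va * vb) ** 0.5

random.seed(0)
T = 1_000.0  # long window, approximating "in the long run"
n = 100_000
ts = [random.uniform(0.0, T) for _ in range(n)]
fs = [math.sin(t) for t in ts]   # bounded, differentiable f
dfs = [math.cos(t) for t in ts]  # its first derivative

# Near 0: f and f' are deterministically linked at every instant,
# yet uncorrelated over a long uniform sampling window.
print(pearson(fs, dfs))
```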