I was thinking this as well, but you could construct a situation that doesn’t have this problem—like a mechanical system that relies on the derivative to perform some action deterministically.
That’s actually an interesting issue in control systems. IIRC, if you set up a system so that some variable B is a function of the time-derivative of A, B = f( dA(t)/dt ), and computing B(T) requires knowing dA/dt at t = T, such a system is called “acausal”. I believe this is because you can’t know dA/dt at T exactly until you have observed A(t) at times after T.
So any physically-realizable system that depends on the time-derivative of some other value is actually depending on that derivative at a slightly earlier point in time.
In contrast, there is no such problem for the integral. If I only know the time series of A(t) up to time T, then I know the integral of A up to time T, and such a relationship is not acausal.
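The asymmetry can be seen in a discrete-time sketch (all values and names below are illustrative; a backward difference stands in for the causal derivative estimate, which necessarily lags the true derivative):

```python
import numpy as np

# Hypothetical sampled signal A(t) = t^2 on a uniform grid.
dt = 1e-3
t = np.arange(0.0, 1.0 + dt, dt)
A = t**2

# A causal (backward-difference) estimate of dA/dt at time T uses only
# samples up to T; it lags the true derivative by O(dt).
dA_causal = np.empty_like(A)
dA_causal[0] = 0.0
dA_causal[1:] = np.diff(A) / dt

# The running integral of A up to time T likewise needs only past samples.
int_A = np.cumsum(A) * dt

# True values for comparison: dA/dt = 2t, and the integral of A is t^3/3.
err_deriv = np.max(np.abs(dA_causal[1:] - 2 * t[1:]))
err_int = np.max(np.abs(int_A - t**3 / 3))
```

Both quantities are computable from past data alone, but the derivative estimate carries an unavoidable O(dt) lag, which is exactly the point about only ever having access to the derivative at a previous point in time.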
In the general case, for a relationship between two systems where B is a function of A, the transfer function from A to B, num(s)/den(s), must satisfy deg(num) ≤ deg(den), where deg() denotes the degree of a polynomial.
(The transfer function is the ratio of B to A in the Laplace domain, where the variable s replaces t. Multiplying by s in the Laplace domain corresponds to differentiation in the time domain, and dividing by s corresponds to integration.)
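The degree condition is mechanical to check. A minimal sketch (the helper names `poly_degree` and `is_realizable` are made up for this example; coefficients are listed highest power first, as NumPy's polynomial routines do):

```python
def poly_degree(coeffs):
    """Degree of a polynomial given coefficients, highest power first."""
    for i, c in enumerate(coeffs):
        if c != 0:
            return len(coeffs) - 1 - i
    return 0  # zero polynomial; degree convention chosen for simplicity

def is_realizable(num, den):
    """True when the transfer function num(s)/den(s) is proper,
    i.e. deg(num) <= deg(den)."""
    return poly_degree(num) <= poly_degree(den)

# A pure differentiator, B/A = s, violates the condition (acausal):
differentiator_ok = is_realizable([1, 0], [1])
# A pure integrator, B/A = 1/s, satisfies it (causal):
integrator_ok = is_realizable([1], [1, 0])
```

This matches the derivative/integral asymmetry above: s alone is improper, 1/s is proper.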
(edit to clarify, then again to clarify some more)
You mean, like the mechanical (well, electronic) one I described?
B = dA/dt doesn’t imply that B is the cause of A. As I pointed out, a current generator attached to a capacitor causes the voltage, the reverse of the first example, but the mathematical relation between voltage and current is the same.
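A quick sketch of the current-source case (component values are arbitrary, chosen for illustration): the capacitor relation i = C dv/dt holds either way, but here the causal direction runs current → voltage, obtained by integrating the supplied current forward in time.

```python
# A current source drives a capacitor; the voltage is the *effect*,
# computed by causally integrating the supplied current.
C = 1e-6           # 1 µF (illustrative value)
I = 2e-6           # constant 2 µA drive
dt = 1e-4
v = 0.0
for _ in range(10_000):    # simulate 1 second
    v += (I / C) * dt      # dv = (i/C) dt

# With constant current, v(t) = (I/C) * t, so after 1 s, v should be 2 V.
```

The same equation i = C dv/dt would describe a voltage source driving the capacitor, where the causal direction is reversed; the mathematics alone doesn't pick a direction.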
“Cause” is an everyday concept that tends to dissolve when looked at too closely. The research on causal analysis of statistical data quite sensibly does not try to define it.
Except for everyone following Pearl.
Ah, ok. Applying his definition (“variable X is a probabilistic-cause of variable Y if P(y|do(x)) != P(y) for some values x and y”) to the signal generator, it says that the voltage causes the current; in the current-source version, that the current causes the voltage. That’s exactly what I would say as well.
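The quoted definition can be checked mechanically on a toy structural model (everything below is invented for illustration: a source sets a voltage X at random, and the current Y = X/R follows deterministically):

```python
import random

random.seed(0)

R = 2.0  # hypothetical resistance

def sample(do_x=None):
    """One draw from the model; do_x forces an intervention on X."""
    x = random.choice([1.0, 2.0]) if do_x is None else do_x
    y = x / R  # Y is structurally determined by X
    return y

# Observational distribution of Y:
obs = [sample() for _ in range(10_000)]
# Interventional distribution under do(X = 2):
intervened = [sample(do_x=2.0) for _ in range(10_000)]

p_obs = sum(1 for y in obs if y == 1.0) / len(obs)              # about 0.5
p_do = sum(1 for y in intervened if y == 1.0) / len(intervened)  # exactly 1
```

Since P(y=1 | do(x=2)) = 1 differs from P(y=1) ≈ 0.5, X is a probabilistic-cause of Y under the definition; swapping which variable the structural equation assigns would reverse the verdict, as in the current-source example.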
Of course, his limitation to acyclic relations excludes from his analysis systems that are only slightly more complicated, such as B=AC,C=−k∫B.
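That cyclic pair is nonetheless straightforward to simulate forward in time, because the integral only depends on past values of B, which breaks the cycle. A forward-Euler sketch with constant A (all values illustrative; with constant A the system reduces to dC/dt = −kAC, i.e. exponential decay):

```python
import math

# Forward-Euler simulation of the cyclic pair B = A*C, C = -k * integral(B).
k, A, dt = 1.0, 1.0, 1e-4
C = 1.0
for _ in range(10_000):   # integrate out to t = 1
    B = A * C             # algebraic (instantaneous) relation
    C += -k * B * dt      # the integral relation advances the state

expected = math.exp(-k * A * 1.0)   # analytic solution: C(1) = e^{-1}
```

Each step reads the current state and writes the next one, which is exactly the time-sliced, acyclic structure that dynamic Bayesian networks formalize.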
That’s what dynamic Bayesian networks are for. The current values of state variables of a system near stable equilibrium are not caused by each other; they are caused by past values. Dynamic Bayesian networks express this distinction with edges that pass forward in time.
The continuous-time limit of a dynamic Bayesian network can be a differential equation such as this.
(ETA) A dynamic Bayesian network is syntactic sugar for an ordinary Bayesian network that has the same structure in each of a series of time slices, with edges from nodes in each time slice to nodes in the next time slice. The Bayesian network that is made by unrolling a dynamic Bayesian network is still completely acyclic. Therefore, Bayesian networks have at least the representation power of finitely iterated systems of explicit recurrence relations and are acyclic, and continuum limits of Bayesian networks have at least the representation power of systems of differential equations and are acyclic. (Some representation powers that these Bayesian networks do not have are the representation powers of systems of implicit recurrence relations, systems of differential algebraic equations without index reduction, and differential games. Something like hybrid Bayesian-Markovian networks would have some of these representation powers, but they would have unphysical semantics (if physics is causal) and would be hard to use safely.)
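The unrolling construction can be sketched concretely: build the time-sliced dependency graph for a toy two-variable model and confirm a topological order exists (the node names and slice structure are invented for illustration, loosely following the B = AC, C = −k∫B example above):

```python
from graphlib import TopologicalSorter

# Unroll a toy dynamic Bayesian network over T time slices:
# within each slice t, C[t] -> B[t] (the algebraic relation), and
# across slices, B[t] -> C[t+1] (the integral accumulates past B).
T = 5
deps = {}   # node -> set of predecessor nodes
for t in range(T):
    deps[("B", t)] = {("C", t)}
    deps[("C", t)] = {("B", t - 1)} if t > 0 else set()

# graphlib raises CycleError if the unrolled graph is cyclic; obtaining a
# topological order demonstrates the unrolled network is acyclic.
order = list(TopologicalSorter(deps).static_order())
```

Even though B and C depend on each other when the time index is erased, the unrolled graph is a DAG, since every cross-variable edge either stays within a slice or points forward in time.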
(Dynamic Bayesian networks at the University of Michigan Chemical Engineering Process Dynamics and Controls Open Textbook (“ControlWiki”))