Crypticity, Reverse Epsilon Machines and the Arrow of Time?
[see https://arxiv.org/abs/0902.1209 ]
Our subjective experience of the arrow of time is occasionally suggested to be an essentially entropic phenomenon.
This sounds cool and deep, but it crashes headlong into the fact that the entropy rate and the excess entropy of any stochastic process are time-symmetric. I find it amusing that, despite hearing this idea often from physicists and the like, this rather elementary fact has apparently not stopped their storycrafting.
Luckily, computational mechanics provides us with a measure that is not time-symmetric: the stochastic complexity C of the epsilon machine.
For any stochastic process we may also consider the epsilon machine of the reverse process, in other words the machine that predicts the past from the future. This can be a completely different machine, whose reverse stochastic complexity Crev need not equal C.
Some processes are easier to predict forward than backward; there is considerable evidence that language, for example, is such a process. If the stochastic complexity and the reverse stochastic complexity differ, we speak of a causally asymmetric process.
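To make "stochastic complexity" concrete: for an epsilon machine, C is the Shannon entropy of the stationary distribution over its causal states. Here is a minimal sketch in Python, using the two-state Even Process as a stand-in machine (this example machine is an illustrative assumption, not something from the discussion above; computing Crev would mean building the reverse process's epsilon machine the same way and taking the entropy of its state distribution):

```python
from math import log2

# Illustrative epsilon machine: the Even Process, a standard two-state
# example from computational mechanics (assumed here for demonstration).
# State A: emit 0 w.p. 1/2 and stay in A, or emit 1 w.p. 1/2 and go to B.
# State B: emit 1 w.p. 1 and return to A.
# State-to-state transition probabilities (emitted symbols summed out):
T = {
    "A": {"A": 0.5, "B": 0.5},
    "B": {"A": 1.0, "B": 0.0},
}

def stationary(T, iters=1000):
    """Stationary distribution over causal states, via power iteration."""
    states = list(T)
    pi = {s: 1.0 / len(states) for s in states}
    for _ in range(iters):
        pi = {s: sum(pi[r] * T[r][s] for r in states) for s in states}
    return pi

def stochastic_complexity(T):
    """C = Shannon entropy (in bits) of the causal-state distribution."""
    pi = stationary(T)
    return -sum(p * log2(p) for p in pi.values() if p > 0)

C = stochastic_complexity(T)
print(f"C = {C:.3f} bits")  # ~0.918 bits: entropy of the (2/3, 1/3) state mix
```

No claim is made here about whether the Even Process itself is causally asymmetric; the point is only that C measures bits needed to track the predictor's states, and that the same computation on the reverse machine can give a different number.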
Alec Boyd pointed out to me that the classic example of a glass falling off a table is naturally thought of in these terms. The forward process is easy to describe while the backward process is hard to describe, where easy and hard are meant in the sense of stochastic complexity: the number of bits needed to specify the states of a perfect minimal predictor and retrodictor, respectively.
Remark: time asymmetry in this sense is a fundamentally stochastic phenomenon. The underlying (say, classically deterministic) laws are still time-symmetric.
The hypothesis is then: many, perhaps most, macroscopic processes of interest to humans, including other agents, are fundamentally such causally asymmetric (and cryptic) processes.
As for the claim that entropy is time-symmetric: it is symmetric around a starting point t0 of low entropy. The further t is from t0, the more entropy you'll have, in either direction; the absolute value |t − t0| is what matters.
In this case, t0 is usually taken to be the big bang. So the further in time you are from the big bang, the less the universe is like a dense uniform soup with little structure that needs description, and the higher your entropy will be. That’s how you get the subjective perception of temporal causality.
Presumably, this would hold on the other side of t0 as well, if there is one. But we can't extrapolate past t0: close to t0 everything gets extremely energy-dense, so we'd need to know how to do quantum gravity to calculate what the state on the other side might look like, and so we can't check that. The notion of time as we're discussing it here might break down at those energies anyway.
See also the Past Hypothesis. If we instead take a non-speculative starting point as t0, namely now, we could no longer trust our memories, including any evidence we believe we have about the entropy of the past being low, or about physical laws stating that entropy increases with distance from t0. David Albert therefore says doubting the Past Hypothesis would be "epistemically unstable".