Mathematician here. I wanted to agree with @pianoforte611 - just because you have infinite time doesn’t mean that every event will repeat over and over.
For those interested in further reading, the general question here is essentially that of transience in Markov chains; I also have some examples. :)
Let us say that we have a particle moving along a line. In each unit of time, it moves a unit of distance either left or right, with probability 1⁄10 of the former and 9⁄10 of the latter. How often can we expect the particle to have returned to its starting point? Well, to return to the origin, we must have moved left and right an equal number of times. At odd times, this is impossible; at time 2n, the probability of this is
$\binom{2n}{n} \cdot \left(\frac{1}{10}\right)^n \cdot \left(\frac{9}{10}\right)^n$ (this is not difficult to derive, and a simple explanation is given here). Summing this over all n, we get that the expected number of returns is 1/4—in other words, we have no guarantee of returning even once, much less an infinite number of times!
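Here is a quick Monte Carlo check of that expected number of returns—a sketch only; the function name, step count, and trial count are my own illustrative choices, not from the thread:

```python
import random

def count_returns(p_left=0.1, steps=200):
    """Count how many times a biased 1-D walk revisits its starting point.

    With a strong rightward drift, any returns happen very early,
    so a modest step cap loses essentially nothing.
    """
    pos, returns = 0, 0
    for _ in range(steps):
        pos += -1 if random.random() < p_left else 1
        if pos == 0:
            returns += 1
    return returns

random.seed(0)
trials = 20_000
avg = sum(count_returns() for _ in range(trials)) / trials
print(f"average number of returns: {avg:.3f}")  # close to 1/4
```

Averaging over many independent walks, the empirical mean number of returns settles near 0.25, matching the series computation above.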
If this example strikes you as somewhat asymmetric, worry not—if the point were moving in three dimensions instead of one (so it could move up, down, forward, or back as well as left or right), then even a weighting of 1⁄6 for each direction means that you won’t return to the starting point infinitely often. If you don’t like having a fixed origin, use two particles and have them move independently in three dimensions. They will meet after time zero with less-than-unit probability (in fact, with the same probability as in the previous problem, since the two problems are equivalent after you apply a transformation).
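The three-dimensional case can also be checked numerically. The sketch below estimates the fraction of symmetric 3-D walks that ever revisit the origin; the step cap and trial count are my own choices for illustration:

```python
import random

def returns_to_origin(steps=500):
    """Does a symmetric 3-D walk revisit the origin within `steps` steps?

    Each step picks one of the three axes uniformly, then moves +1 or -1
    along it, giving probability 1/6 to each of the six directions.
    """
    x = [0, 0, 0]
    for _ in range(steps):
        axis = random.randrange(3)
        x[axis] += random.choice((-1, 1))
        if x == [0, 0, 0]:
            return True
    return False

random.seed(1)
trials = 5_000
frac = sum(returns_to_origin() for _ in range(trials)) / trials
print(f"fraction that ever return: {frac:.3f}")
```

The observed fraction sits well below 1 (Pólya's theorem puts the true return probability around 0.34), so even the perfectly symmetric walk is transient in three dimensions.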
What if we assume a finite universe instead? Contrary to what the post we’re discussing might suggest, this actually makes recurrence more reasonable. To show that every state of a finite universe recurs infinitely often, we only need to know one thing: that every state of the universe can be eventually reached from every other state.
Is this plausible? I’m not sure. The first objection that comes to mind is entropy: if entropy always increases, then we can never get back to where we started. But I seem to recall a claim that entropy is a statistical law: it’s not that it cannot decrease, but that it is extremely unlikely to do so. Extremely low probabilities do not frighten us here: if the universe is finite, then all such probabilities can be lower-bounded by some extremely tiny constant, which will eventually be defeated by infinite time.
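The point about tiny probabilities being defeated by unbounded time can be made concrete with a one-liner of a simulation—here p = 1e-4 is my own illustrative number, standing in for some "extremely unlikely" per-step event:

```python
import random

# An event with fixed positive probability p per step occurs after
# finitely many steps with probability 1; the waiting time is geometric
# with mean 1/p.
random.seed(3)
p = 1e-4
steps = 0
while random.random() >= p:
    steps += 1
print(f"rare event occurred after {steps} steps")  # on the order of 1/p
```

No matter how small p is, the wait is finite almost surely—which is exactly why a fixed lower bound on the probability is all the finite-universe argument needs.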
But if the universe is infinite, this does not work: not even if the universe is merely potentially infinite, by which I mean that it can grow to an arbitrarily large finite size. This is already enough for the Markov chain in question to have infinitely many states, and my intuition tells me that in such a case it is almost certainly transient.
You are absolutely correct. If the number of states of the universe is finite, then as long as any state is reachable from any other, every state will be visited arbitrarily often if you wait long enough.
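This finite-state recurrence is easy to witness in simulation. Below is a minimal sketch with a toy chain of my own construction (four states, each step moving to a uniformly random *other* state, so every state is reachable from every other):

```python
import random
from collections import Counter

random.seed(2)
states = range(4)
visits = Counter()
state = 0
for _ in range(100_000):
    # Move to a uniformly random state other than the current one;
    # this chain is irreducible, so every state recurs.
    state = random.choice([s for s in states if s != state])
    visits[state] += 1

print(dict(visits))  # each state is visited tens of thousands of times
```

In a long run, every state racks up visits roughly in proportion to its stationary probability—none is visited only finitely often, in contrast to the transient infinite-state walks above.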