The Arrow of Time
Gary Drescher dissolves this old mystery in one chapter of “Good and Real”. Amazing. I must have read a dozen pop science books that discuss this problem, analyze some proposed solutions, and then leave it as a mystery. Drescher crushes it.
This may not fit in one posting, but it might well fit in a sequence of four or so.
Believe it or not, I actually started an article on this around “17 October 2009” (per the date stamp) and never finished it. (I actually had the more ambitious idea of summarizing every chapter in one article, but figured Chapter 3 would be enough.) Might as well post what I have (formatting and links don’t carry over; I’ve corrected the worst issues) …
Here I attempt to summarize the points laid out in Gary Drescher’s Good and Real: Demystifying Paradoxes from Physics to Ethics (discussed previously on Less Wrong), chapter 3, which explores the apparent flow of time and gives a reductionist account of it. To [...] What follows is a restating of the essential points and the arguments behind them in my own words, which I hope to make faithful to the text. It’s long, but a lot shorter than reading the chapter, a lot cheaper than buying the book, and a lot less subjunctively self-defeating than pirating it.
The focus of the chapter is to solve three interrelated paradoxes: If the laws of physics are time-symmetric:
1) Why does entropy increase in only one direction?
2) Why do we perceive a directional flow of time?
3) Why do we remember the past but not the future?
Starting from the first: why does entropy—the total disorder in the universe—increase asymmetrically? To answer, start with a simple case: a billiard-ball simulation, where balls have a velocity and position and bounce elastically off each other per the standard (time-symmetric) equations, which conserve linear momentum and kinetic energy. For a good example of entropy’s increase, let’s initialize it with a non-uniformity: there will be a few large, fast balls and many small, slow balls.
What happens? Well, as time goes by, the balls bounce off each other, and the larger, faster balls transfer momentum to the smaller, slower ones. We see the standard increase in entropy as time increases. So if you were to watch a video of the simulation in action, there would be telltale signs of which direction is positive and which is negative: in the positive direction, large balls plow through groups of smaller balls, each leaving a “wake” of sped-up smaller balls behind it. But if we watch it in reverse, going back toward the start, entropy of course decreases: highly ordered wakes spontaneously form just before the large balls pass through them.
Hence, the asymmetry: entropy increases in only one direction.
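This behavior is easy to reproduce in a toy model. The sketch below is my own construction, not Drescher’s: it abstracts away the balls’ positions and simply lets randomly chosen pairs collide under the time-symmetric 1-D elastic-collision rule, starting from a few heavy, fast balls among many light, slow ones. “Entropy” here is a coarse-grained stand-in: the Shannon entropy of a histogram of the balls’ kinetic energies.

```python
# Toy model (my construction, not Drescher's): random pairs of balls
# undergo 1-D elastic collisions, which are time-symmetric and conserve
# momentum and kinetic energy. "Entropy" is coarse-grained: the Shannon
# entropy of a histogram of kinetic energies.
import math
import random

def collide(m1, v1, m2, v2):
    """1-D elastic collision; conserves momentum and kinetic energy."""
    total = m1 + m2
    return (((m1 - m2) * v1 + 2 * m2 * v2) / total,
            (2 * m1 * v1 + (m2 - m1) * v2) / total)

def coarse_entropy(masses, vels, bins=20):
    """Shannon entropy (nats) of the kinetic-energy histogram."""
    kes = [0.5 * m * v * v for m, v in zip(masses, vels)]
    hi = max(kes) + 1e-12
    counts = [0] * bins
    for ke in kes:
        counts[min(int(ke / hi * bins), bins - 1)] += 1
    n = len(kes)
    return -sum(c / n * math.log(c / n) for c in counts if c)

random.seed(0)
# Non-uniform start: a few large fast balls, many small slow ones.
masses = [10.0] * 5 + [1.0] * 95
vels = [5.0] * 5 + [0.1] * 95

p0 = sum(m * v for m, v in zip(masses, vels))   # total momentum before
s0 = coarse_entropy(masses, vels)
for _ in range(5000):
    i, j = random.sample(range(len(masses)), 2)
    vels[i], vels[j] = collide(masses[i], vels[i], masses[j], vels[j])
p1 = sum(m * v for m, v in zip(masses, vels))   # total momentum after
s1 = coarse_entropy(masses, vels)

print(f"momentum {p0:.3f} -> {p1:.3f}; coarse entropy {s0:.3f} -> {s1:.3f}")
```

Despite every individual collision being exactly reversible (total momentum comes out unchanged), the coarse entropy climbs as collisions mix the initially lopsided distribution of energies.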
The mystery dissolves when you consider what happens when you continue to view the simulation backwards, and proceed through the initial time, onward to t= −1, −2, −3, … . You see the exact same thing happen going in the direction of negative time from t=0. So, we see our confusion: entropy does not increase in just the positive direction: it increases as you move away from zero, even if that direction isn’t positive.
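That the collision dynamics really are time-symmetric can be checked mechanically: flip the sign of every velocity (“play the film backwards”), apply the same collision rule, and flip again; the original state comes back exactly. A minimal check, using a 1-D elastic-collision rule as a stand-in for the billiard dynamics (the specific masses and velocities are arbitrary):

```python
# Checking time-symmetry of the (stand-in) collision rule directly:
# negate velocities, collide, negate again, and the pre-collision
# state is recovered. Masses and velocities are arbitrary examples.
def collide(m1, v1, m2, v2):
    """1-D elastic collision; conserves momentum and kinetic energy."""
    total = m1 + m2
    return (((m1 - m2) * v1 + 2 * m2 * v2) / total,
            (2 * m1 * v1 + (m2 - m1) * v2) / total)

m1, m2 = 10.0, 1.0
v1, v2 = 5.0, 0.1

w1, w2 = collide(m1, v1, m2, v2)      # run the collision forward
r1, r2 = collide(m1, -w1, m2, -w2)    # "play the film backwards"
r1, r2 = -r1, -r2                     # un-flip the velocities

print((v1, v2), "->", (w1, w2), "-> reversed ->", (r1, r2))
```

The reversed run lands back on the initial velocities (up to floating-point rounding), which is exactly the symmetry that makes “which direction is the future?” a non-trivial question.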
So, we need to reframe our understanding: instead of thinking in terms of positive and negative time directions, we should think in terms of “pastward” and “futureward” directions. Pastward means in the direction of the initial state, and futureward means away from it. Both the sequences t= 1, 2, 3, … and t= −1, −2, −3, … go into the future. (Note the parallel here to the reframing of “up” and “down” once your model of the earth goes from flat to round: “down” no longer means a specific vector, but the vector from where you are to the center of the earth. So you change your model of “down” and “up” to “centerward” and “anticenterward” [my terms, not Drescher’s], respectively.)
Okay, that gets us a correct statement of the conditions under which entropy increases, but still doesn’t say why entropy increases in only the futureward direction. For that, we need to identify what the positive-time futureward direction and the negative-time futureward direction have in common. For one thing, the balls become correlated. Previously (pastwardly), knowing a ball’s state did not allow you to infer much about the other balls’ states, as the velocities were set independently of one another. But the accumulation of collisions causes the balls to become correlated—in effect, to share information with each other. [Rephrase to discuss elimination of gradients/exchange of information of all parts of system?...]
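The “sharing information” point can be made concrete: draw many independent (v1, v2) pairs, put each pair through one collision, and the post-collision velocities are correlated even though the pre-collision ones were not. This is my own sketch, not Drescher’s; the unequal masses matter, since equal masses in 1-D merely swap velocities, which would leave the measured correlation at zero.

```python
# Sketch (my construction): one 1-D elastic collision turns independent
# velocities into correlated ones. Unequal masses are essential here;
# equal masses would just swap the velocities.
import math
import random

def collide(m1, v1, m2, v2):
    """1-D elastic collision; conserves momentum and kinetic energy."""
    total = m1 + m2
    return (((m1 - m2) * v1 + 2 * m2 * v2) / total,
            (2 * m1 * v1 + (m2 - m1) * v2) / total)

def pearson(xs, ys):
    """Sample Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

random.seed(0)
m1, m2 = 10.0, 1.0
before = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(100_000)]
after = [collide(m1, v1, m2, v2) for v1, v2 in before]

r_before = pearson([v1 for v1, _ in before], [v2 for _, v2 in before])
r_after = pearson([w1 for w1, _ in after], [w2 for _, w2 in after])
# Before the collision: near zero (independent). After: clearly positive,
# so knowing one ball's velocity now tells you about the other's.
print(f"correlation before: {r_before:+.3f}, after: {r_after:+.3f}")
```

In other words, each collision leaves the participants carrying mutual information about one another, and those accumulated correlations are what the futureward direction has that the pastward direction lacks.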
Note that entropy need not increase uniformly: this model still permits local islands of lower entropy in the futureward direction, as long as total entropy still increases. Consider the “wakes” left by the large balls mentioned above. The large balls “plow” right through the small balls and leave a (low-entropy) wake. (Even as they do this, the large balls transfer momentum to the smaller balls and increase total entropy.) The wakes let you identify time’s direction: a wake is always located where the large ball was in an immediately pastward state. This relationship also implies that the wake contains a “record” of sorts: the current state gives physical form to information about a pastward state.
This process is similar to what goes on in the brain. Just as wakes are islands of low entropy containing information about pastward states, so too is your brain an island of low entropy containing information about pastward states. (Life forms are already known to be dissipative systems that maintain an island of low entropy at the cost of a counterbalancing increase elsewhere.) [...]
So it’s not that “gee, we notice time goes forward, and we notice that entropy happens to always increase”. Rather, the increase of entropy determines what we will identify as the future, since any time slice will only contain versions of ourselves with memories of pastward states.
Hawking did this analysis in the first edition of A Brief History of Time—though he made a complete mess of it—and concluded that time will start going backwards when the universe stops expanding!
I remember back when I read this at university, I thought: Boltzmann will be turning in his grave. I also remember immodestly thinking: here’s a smart, famous scientist—and even spotty teenage me could see, in about two seconds, what a ridiculous argument he was making.
When I re-read A Brief History of Time in college, I remember bemusedly noticing that Hawking’s argument would be stronger if you reversed its conclusion.
A note to myself from 2009 claims that Hawking later dropped that argument. Can anyone substantiate that?
He has also edited A Brief History of Time to remove the howler. See page 64 for the updated text.
BTW, Sean Carroll just wrote an entire popular-level book on this subject.
Yes, I actually read a large portion of that book (“From Eternity to Here”?) whilst still in the bookstore. It provided great exposition of several difficult concepts, but ultimately I was unimpressed, since Carroll would frequently present a problem in thermodynamics, and I would be thinking, “Yeah, so what about the Barbour/Drescher solution to this?” and he wouldn’t address it or say anything that would undermine it.
Cool. Except that one or the other of us didn’t quite understand Drescher. Because my understanding was that he considered and rejected the idea that the arrow of perceived time is the same as the order of increased entropy. I thought he said that it is the inter-particle correlations that matter for subjective time—not entropy as such. But perhaps I misunderstood.
I’m glad you bring this up, I’ve been interested in a discussion on this.
Drescher makes extensive use of a generalized concept of a “wake”: in the ball case, a wake is a feature from which you can identify which direction is “pastward”, i.e., the direction of minimal inter-particle entanglement. Any mechanism that allows such an identification can be thought of as a generalization of the “wake” that appears in the setup.
One such wake is the formation of memories (including memories in a brain), which, like the literal wake, exploit regularities of the environment to “know” the pastward direction, and (also like the wake) necessarily involve localized decrease but global increase of entropy. (edit: original was reversed)
So yes, I agree that Drescher is saying that the interparticle correlations are what determine the subjective feeling of time—but he’s also saying that the subjective feeling (memory formation) necessarily involves a local decrease of entropy and counterbalancing increase somewhere else.
Unfortunately, I’m probably not the ideal person to carry out this discussion with you. I got my copy of the book through interlibrary-loan and it is due back tomorrow. :-(