Predicting the ratio at t=20s is hopeless. The only sort of thing you can predict is the variance in the ratio over time, like the ratio as a function of time is μ(t) = 0.5 + ϵ, where ϵ ~ N(0, σ^2). Here the large number of atoms lets you predict σ^2, but the exact number after 20 seconds is chaotic. To get an exact answer for how much initial perturbation still leads to a predictable state, you’d need to compute the Lyapunov exponents of an interacting classical gas system, and I haven’t been able to find a paper that does this within 2 min of searching. (Note that if the atoms are non-interacting the problem stops being chaotic, of course, since they’re just bouncing off the walls of the box.)
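For scale, here’s a minimal sketch of that σ^2 estimate, assuming (as a crude toy model not stated above) that each atom independently ends up on either side with probability 1/2, and taking N ≈ 10^24 atoms as an assumed round number:

```python
import math

# Minimal sketch: if each of N atoms is (crudely) independently on the left or
# right half with probability 1/2, the fraction on one side is binomially
# distributed, so the predictable spread of the ratio around 0.5 is 0.5/sqrt(N).
N = 1e24                       # assumed number of atoms in the box
sigma = 0.5 / math.sqrt(N)     # standard deviation of the left/right ratio
print(f"sigma ~ {sigma:.1e}")  # ~5e-13: the ratio stays extremely close to 0.5
```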
https://www.sciencedirect.com/science/article/abs/pii/S1674200121001279
They find a Lyapunov exponent of about 1 or 2 (where time is basically in units of the time it takes a particle at average velocity to cross the length of the box).
For a room-temperature gas, this timescale is about 1/400 seconds. So the divergence after 20 seconds should grow by a factor of over e^8000 (until it hits the ceiling of maximum possible divergence).
Since an Angstrom is only 10^-10 m, if you start with an Angstrom offset, the divergence reaches maximum by about a tenth of a second.
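Spelling out that arithmetic as a quick sketch; the 1 m box length and ~400 m/s thermal speed are my assumed round numbers, and the exponent uses the lower end of the paper’s 1–2 range:

```python
import math

# Back-of-envelope version of the divergence estimate above.
# Assumed round numbers (not from the thread): a 1 m box and ~400 m/s thermal speed.
L = 1.0            # box length in meters (assumed)
v = 400.0          # typical thermal speed in m/s (assumed, roughly room-temp gas)
lam = 1.0          # Lyapunov exponent per crossing time (lower end of the paper's 1-2)

tau = L / v                       # crossing time, ~1/400 s
efolds_20s = lam * 20.0 / tau     # e-foldings in 20 seconds, ~8000
print(f"e-foldings in 20 s: {efolds_20s:.0f}")

# Time for an initial 1 angstrom offset to grow to a 1 m "ceiling".
d0, dmax = 1e-10, 1.0
t_saturate = math.log(dmax / d0) * tau / lam
print(f"time to saturate: {t_saturate:.3f} s")   # ~0.06 s, i.e. under a tenth of a second
```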
Do you know how to interpret “maximum divergence” in this context? Also, IIRC aren’t there higher-order exponents that might decay slower? (I just read about this this morning, so I am quite unfamiliar with the literature here)
Do you know how to interpret “maximum divergence” in this context?
Hm, this is a good question.
In writing my original reply, I figured “maximum divergence” was a meter. You start with two trajectories an angstrom apart, and they slowly diverge, but they can’t diverge more than 1 meter.
I think this is true if you’re just looking at the atom that’s shifted, but not true if you look at all the other atoms as well. Then maybe we actually have a 10^24-dimensional state space, and we’ve perturbed the state space by 1 angstrom in 1 dimension, and “maximum divergence” is actually more like the size of state space (√(1^2 + 1^2 + ...), with 10^24 terms, = 10^12 meters).
In which case it actually takes two tenths of a second for exponential chaos to go from 10^-10 m to 10^12 m.
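Same sketch as before, but with the ceiling set to that state-space scale instead of 1 m (same assumed crossing time, exponent, and atom count):

```python
import math

# Re-run the saturation estimate with the "ceiling" taken to be the diameter of
# the full 10^24-dimensional configuration space rather than 1 m.
tau = 1.0 / 400.0                 # crossing time in seconds (from the estimate above)
lam = 1.0                         # Lyapunov exponent per crossing time (assumed lower bound)
d0 = 1e-10                        # initial offset: 1 angstrom
n_dims = 1e24                     # one coordinate per atom (assumed atom count)
dmax = math.sqrt(n_dims) * 1.0    # sqrt(10^24) * 1 m = 10^12 m

t_saturate = math.log(dmax / d0) * tau / lam
print(f"time to saturate: {t_saturate:.2f} s")   # ~0.13 s, roughly two tenths of a second
```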
Also, IIRC aren’t there higher-order exponents that might decay slower?
Nah, I don’t think that’s super relevant here. All the degrees of freedom of the gas are coupled to each other, so the biggest source of chaos can scramble everything just fine.
Nah, I don’t think that’s super relevant here. All the degrees of freedom of the gas are coupled to each other, so the biggest source of chaos can scramble everything just fine.
Hmm, I don’t super buy this. For example, this model predicts no standing wave would survive for multiple seconds, but this is trivial to disprove by experiment. So clearly there are degrees of freedom that remain coupled. No waves of substantial magnitude are present in the initialization here, but your argument clearly implies a decay rate for any kind of wave that is too substantial.
Yeah, good point (the examples, not necessarily any jargon-ful explanation of them). Sound waves, or even better, slow-moving vortices, or (better still, and different) the diffusion of a cloud of one gas through a room filled with a different gas, all show that you don’t get total mixing of a room on a one-second timescale (rough diffusion numbers sketched below).
I think most likely, I’ve mangled something in the process of extrapolating a paper on a tiny toy model of a few hundred gas atoms to the meter scale.
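To put rough numbers on the diffusion case mentioned above (the diffusion coefficient and room scale are assumed textbook-ish round values, not from this thread):

```python
# Rough timescale for one gas diffusing across a room-sized box.
# Assumed round numbers: D ~ 2e-5 m^2/s (typical gas-in-gas diffusion coefficient
# at room temperature and pressure) and a 1 m length scale.
D = 2e-5      # m^2/s, assumed diffusion coefficient
L = 1.0       # m, assumed room/box scale

t_diffuse = L**2 / (2 * D)    # diffusive timescale, t ~ L^2 / (2D)
print(f"t ~ {t_diffuse:.0f} s (~{t_diffuse/3600:.0f} hours)")  # ~25000 s: hours, not seconds
```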
The goal is not to predict the ratio, but just to predict which side will have more atoms (no matter how small the margin). It seems very likely to me that any such calculation would be prohibitively expensive and would approximately require logical omniscience.
To clarify this, we are assuming that without random perturbation, you would get 100% accuracy in predicting which side of the system has more atoms at t=20s. The question is how much of that 100% accuracy you can recover with a very very small unknown perturbation.
Is this supposed to involve quantum physics, or just some purely classical toy model?
In a quantum physics model, the probability of observing more atoms on one side than the other will be indistinguishable from 50% (assuming that your box is divided exactly in half and all other things are symmetric etc). The initial perturbation will make no difference to this.
Quantum physics. I don’t see why it would be indistinguishable from 50%.
Agree that there will be some decoherence. My guess is decoherence would mostly leave particle position at this scale intact, and if it becomes a huge factor, I would want the question to be settled on the basis of being able to predict which side has higher irreducible uncertainty (i.e. which side had higher amplitude, if I am using that concept correctly).
Citing https://arxiv.org/abs/cond-mat/9403051: “Furthermore if a quantum system does possess this property (whatever it may be), then we might hope that the inherent uncertainties in quantum mechanics lead to a thermal distribution for the momentum of a single atom, even if we always start with exactly the same initial state, and make the measurement at exactly the same time.”
The author then proceeds to demonstrate that this is indeed the case. I guess it partially answers the question: the quantum state thermalises, and you’ll get a classical thermal distribution of results for at least some measurements, even when measuring the system in the same quantum state.
The less initial uncertainty in energy, the faster the system thermalises. That is, to slow quantum thermalisation down, you need to initialize the system with atoms in highly localized positions, but then you can’t know their exact velocities and can’t predict the classical evolution.
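A rough uncertainty-principle estimate of that tradeoff; the 1 angstrom localization and the argon mass are assumed inputs for illustration:

```python
# Heisenberg-style estimate: localize an argon atom to ~1 angstrom and see how
# large the unavoidable velocity uncertainty becomes.
hbar = 1.055e-34          # J*s
m_ar = 39.95 * 1.66e-27   # kg, argon atomic mass
dx = 1e-10                # m, assumed position uncertainty (1 angstrom)

dv = hbar / (2 * m_ar * dx)   # minimum velocity uncertainty from dx*dp >= hbar/2
print(f"dv >= {dv:.1f} m/s")  # ~8 m/s, vs a ~400 m/s thermal speed: a few percent per atom
```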
Decoherence (or any other interpretation of QM) will definitely lead to a pretty uniform distribution over this sort of time scale. Just as in the classical case, the underlying dynamics is extremely unstable within the bounds of conservation laws, with the additional problem that the final state for any given perturbation is a distribution instead of a single measurement.
If there is any actual asymmetry in the setup (e.g. one side of the box is 0.001 K warmer than the other, or the volumes of the two sides differ by 10^-9 m^3), you will probably get a very lopsided distribution for an observation of which side has more molecules, regardless of initial perturbation (rough numbers sketched below).
If the setup is actually perfectly symmetric though (which seems fitting with the other idealizations in the scenario), the resulting distribution of outcomes will be 50:50, essentially independent of the initial state within the parameters given.
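For scale on the asymmetry example above, a minimal sketch; the total volume and molecule count are my assumed round numbers:

```python
import math

# Compare the systematic excess from a tiny volume asymmetry against the
# sqrt(N) statistical fluctuation in the left/right split.
# Assumed round numbers: a 1 m^3 box holding ~1e25 molecules, plus the
# 10^-9 m^3 volume mismatch from the example above.
N = 1e25      # assumed total number of molecules
V = 1.0       # m^3, assumed total volume
dV = 1e-9     # m^3, volume difference between the two sides

systematic_diff = N * dV / V     # expected count difference between the two sides
random_diff = math.sqrt(N)       # typical random count difference for a symmetric box

print(f"systematic difference ~ {systematic_diff:.1e}")  # ~1e16 molecules
print(f"random fluctuation    ~ {random_diff:.1e}")      # ~3e12 molecules
# The systematic excess is thousands of times the random fluctuation,
# hence the lopsided distribution of outcomes.
```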
So the question is not about real argon gas, but about a billiard-ball model? That should be stated in the question.