Yes, it is the relevant quantity in the limit of an infinite number of uses of the channel. If you can use the channel just once, it does not tell you much.
Actually, the mutual information has a well-defined operational meaning. For example, the maximum rate at which we can transmit a signal through a noisy channel is given by the mutual information between the input and the output of the channel. So it depends on which task you are interested in.
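As a small numerical illustration of this operational meaning (my own sketch, not from the original comment): for the binary symmetric channel, the capacity is the mutual information between input and output maximized over input distributions, which works out to C = 1 − H(p), where H is the binary Shannon entropy and p the bit-flip probability.

```python
import math

def binary_entropy(p):
    """Shannon entropy of a Bernoulli(p) bit, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(flip_prob):
    """Capacity of a binary symmetric channel, in bits per use.

    For the BSC, the mutual information between input and output is
    maximized by a uniform input, giving C = 1 - H(flip_prob).
    """
    return 1.0 - binary_entropy(flip_prob)

print(bsc_capacity(0.0))   # noiseless channel: 1 bit per use
print(bsc_capacity(0.5))   # output independent of input: 0 bits per use
print(bsc_capacity(0.11))  # roughly 0.5 bits per use
```

Note that this rate is achievable only in the limit of many channel uses, which is exactly the caveat raised above: for a single use, the mutual information guarantees much less.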
Imagine a water wheel. The direction the river flows in controls the direction that the wheel turns. The amount of water in the wheel doesn’t change.
In this case you do not say “the wheel rotates in the direction of water increase”, but “the wheel rotates in the direction of water flow”.
I can see how you could argue that “the consciousness perceives past and future according to the direction of time in which it radiated heat”. But if you mean that heat flow (or some other entropy-related phenomenon) is the explanation for our time perception (just as the water flow explains the wheel, or the DC voltage explains the current in a circuit), this seems to me a bold and extraordinary claim, one that would need a lot more evidence, both theoretical and experimental.
This is technically true of the universe as a whole. Suppose you take a quantum hard drive filled with 0′s, and fill it with bits in an equal superposition of 0 and 1 by applying a Hadamard gate. You can take those bits and apply the gate again to get the 0′s back. Entropy has not yet increased. Now print those bits. The universe branches into 2^bit count quantum branches. The entropy of the whole structure hasn’t increased, but the entropy of a typical individual branch is higher than that of the whole structure.
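The Hadamard step of this thought experiment is easy to check numerically (my own sketch, not part of the original comment): the single-qubit Hadamard gate is its own inverse, so applying it twice to |0⟩ returns |0⟩ exactly, with no entropy produced.

```python
import numpy as np

# Single-qubit Hadamard gate: sends |0> to (|0>+|1>)/sqrt(2),
# and is its own inverse (H @ H = identity).
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)

zero = np.array([1.0, 0.0])    # the state |0>
superposed = H @ zero          # equal superposition of |0> and |1>
restored = H @ superposed      # applying H again recovers |0>

print(superposed)  # amplitudes [0.707..., 0.707...]
print(restored)    # [1., 0.] up to floating-point error
```

The irreversible step is the “printing”: once the superposed bits are copied into the environment, the branches decohere and can no longer be recombined by a second Hadamard.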
Yes, whenever you pinch a density matrix, its entropy increases. It depends on your philosophical stance on measurement and decoherence whether the superposition could be retrieved.
In general, I am more on the skeptical side about the links between abstract information and thermodynamics (see for instance https://arxiv.org/abs/1905.11057). It is my job, so I cannot be entirely skeptical. But there is a lot of work to do before we can claim to have derived thermodynamics from quantum principles (at the state of the art, there is not even a consensus among the experts about what the appropriate definitions of work and heat should be for a quantum system).
Anyway, does the brain actually check whether it can uncompute something? How is this related to the direction in which we perceive the past? The future can (in principle) be computed, and the past cannot be uncomputed; yet we know about the past and not about the future: is this really so obvious?
[...] The universe as a whole behaves kind of like a reversible circuit.
This is another strong statement. Maybe in the 18th century you would have said that the universe is a giant clock (mechanical philosophy), and in the 19th century that the brain is basically a big telephone switchboard.
I am not saying that it is wrong. Every new technology can provide useful insights about nature. But I think we should beware not to take these analogies too far.
I agree; but, if I understood the OP correctly, he is averaging over a time scale much larger than the time required to reach equilibrium.
It is interesting to think that dogs may have been selected for hundreds of generations for their ability to influence the emotions of humans.
While of course it could, current measurements suggest that it is not.
The human brain radiates waste heat. It is not a closed system.
Sure, but are you saying that the human brain perceives the whole universe (including the heat it has dissipated into its surroundings) when deciding what to label as “past” or “future”? The entropy of the human body is approximately constant (as long as you are alive).
How do you make space for new memories? By forgetting old ones? Info can’t be destroyed. Your brain is taking in energy dense low entropy food and air, and radiating out the old memory as a pattern of waste heat. Maybe you were born with a big load of empty space that you slowly fill up. A big load of low entropy blank space can only be constructed from the negentropy source of food+air+cold.
Ok, we should clarify which entropy you are talking about. Since this is a post about thermodynamics, I assumed that we are talking about dS = dQ/T, that is, the entropy for which the second law was introduced. In that case, when your brain’s temperature goes down its entropy also goes down, no use splitting hairs.
In the second paragraph, it seems instead that you are talking about an uncertainty measure, like the Von Neumann entropy*. But the Von Neumann entropy of what? The brain is not a string or a probability distribution, so the VN entropy is ill-defined for a brain. But fine, let us suppose that we have agreed on an abstract model of the brain on which we can define a VN entropy (maybe its quantum density matrix, if one can define such a thing for a brain). Then:
in a closed system, the fact that “info can’t be destroyed” means that the total “information” (in a sense, the total Von Neumann entropy) is constant. It never increases nor decreases, so you cannot use it to define an arrow of time;
in an open system (like the brain) it can increase or decrease. When you sleep, I guess it should decrease, because in some sense the brain gets “tidied up”, many memories of the day are deleted, etc. But, again, when you go to bed your consciousness does not perceive the following morning as “past”.
*While the authenticity of the following quote is debated, it is worth stressing that the thermodynamic entropy and the Shannon/Von Neumann entropy are indeed two entirely different things, though related in some specific contexts.
You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one really knows what entropy really is, so in a debate you will always have the advantage.
-J. Von Neumann? (allegedly to Shannon)
No. As I said in this comment, this cannot be true; otherwise in the evening you would be able to make prophecies about the following morning.
Your brain cannot measure the entropy of the universe, and its own entropy is not monotone in time.
The energy constrains the momenta, but not the positions. If there is infinite space, the phase space is also infinite, even at constant energy.
Take two balls which both start at x=0, one with velocity v(0) = 1 and the other with velocity v(0) = −1, on an infinite line. They will move apart forever; no recurrence.
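The two-ball example can be written out explicitly (my own sketch): two free particles on an infinite line, so their separation grows linearly and never returns to zero, no matter how long you wait.

```python
def positions(t):
    """Two free particles on an infinite line, both starting at x=0,
    with velocities +1 and -1. No forces, so x(t) = v * t."""
    return (1.0 * t, -1.0 * t)

# The separation is 2t: it grows without bound, so the initial
# configuration (both balls at x=0) never recurs for t > 0.
for t in (0, 10, 100, 1000):
    x1, x2 = positions(t)
    print(t, x1 - x2)
```

This is exactly why Poincaré recurrence fails here: the accessible phase space is unbounded, so trajectories can escape to infinity instead of returning near their starting point.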
The entropy of the brain is approximately constant: a bit higher when you have the flu, a bit lower in the morning (when your body temperature is lower). If we perceived past and future according to the direction in which the entropy of our brain increases, I would remember the next day when going to bed.
Depending on how you define it, there are arguably observations of entropy decreases at small scales (if you are willing to define the “entropy” of a system made of two atoms, for example).
At macroscopic scales (10^23 molecules), it is as unlikely as a miracle.
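A back-of-the-envelope calculation makes the scale gap vivid (my own illustration, using the standard textbook toy model): the probability that n independent gas molecules are all found in the same half of a box is (1/2)^n, which is observable for a couple of atoms and astronomically small at macroscopic particle numbers.

```python
import math

def log10_prob_half_box(n_molecules):
    """log10 of the probability that n independent molecules all sit
    in one half of a box: (1/2)^n, a simple entropy-decrease event."""
    return -n_molecules * math.log10(2)

print(log10_prob_half_box(2))     # about -0.6: a 1-in-4 event, routine for two atoms
print(log10_prob_half_box(100))   # about -30: already hopeless to observe
print(log10_prob_half_box(6e23))  # about -1.8e23: "as unlikely as a miracle"
```

The exponent itself has ~10^23 digits' worth of suppression at Avogadro-scale particle numbers, which is why small-scale fluctuations coexist peacefully with a macroscopic second law.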
If you work under the hypothesis that information is preserved, then the total entropy of the universe neither increases nor decreases.
At equilibrium, small increases and small decreases should be equally likely, with an unimaginably low probability of large decreases (which becomes 0 if the universe is infinite).
First, a small technical clarification: Poincaré’s recurrence theorem applies to classical systems whose phase space is finite. So, if you believe that the universe is finite, then you have recurrence; otherwise, no recurrence.
I think that your conclusion is correct under the hypothesis that the universe has existed for an infinite time, and that our current situation of low entropy is the result of a random fluctuation.
The symmetry is broken by the initial condition. If at t=0 the entropy is very low, then it is almost certain that it will rise. The expert consensus is that there was a special event (the big bang), which you can model as an initial condition of extremely low entropy.
It seems unlikely that the big bang was the result of a fluctuation from a previously disordered universe: you can estimate the probability of a random fluctuation resulting in the Earth’s existence. If I recall correctly, Penrose did it explicitly, but you do not need the math to understand that (as already pointed out) “a solar system appears from thermal equilibrium” is far more likely than “an entire universe appears from thermal equilibrium”. Therefore, under your hypothesis, we should have expected, with probability of almost 1, to be in the only solar system in the observable universe.
I wish to stress that these are pleasant philosophical speculations, but I do not wish to assign a high confidence to anything said here: even if our mathematical models work well for our purposes, I feel a bit nervous extrapolating them to the entire universe.
They are not things you would like to spend more time on, when you’re rich enough not to work.
People who work for a living aren’t kept from alcoholism because they don’t have time to drink, or can’t afford even cheap alcohol.
That is true; but someone could be kept from alcoholism because he knows he must stay sober to make a living. This comment suggests that some very rich people who lack this motivation do in fact become addicted to drink.
The final paragraph of this comment seems relevant: https://astralcodexten.substack.com/p/book-review-fussell-on-class#comment-1350041
My position is that everyone is already sick and intoxicated at work and we don’t notice or care most of the time.
I do not think this is true. I think that it is important that we clarify this point before continuing.
I’m willing to bet your town’s average wages are terrible in comparison to what you get doing nothing.
No, it is not that the wages are low; it is that they cannot be fired (for both legal and cultural reasons). So they do not risk losing their wage by not working.
To clarify your position, are you saying that if more people were sick/intoxicated then the quality of their work would deteriorate, but this does not really matter because there is sufficient slack in the system and nothing really bad would happen?
The fact that the Enterprise has survived for a long time may be due to the fact that Captain Kirk overrules Spock in the areas where Spock is not competent (for example, when estimating the probability of escaping from a black hole), while Spock is good enough in the other aspects of his job.
The fact that Captain Kirk decides to overrule Spock’s 99.999999% predictions is strong evidence that he does not trust them.