The usual materialist story of life I’ve heard is that life acts like an entropy pump, creating local reductions of entropy within the organism but increasing the entropy outside of the organism. (I think I’ve even seen that in The Sequences somewhere? But I couldn’t find it; feel encouraged to link it.) But I’ve come to think that might actually be wrong, and that life might increase entropy both inside and outside the organism.
Here’s a rough account:
We ought to expect entropy to increase, so a priori life is much more feasible if it increases entropy rather than decreases it.
Living matter is built mainly out of carbon and hydrogen, which are extracted from CO2 and H2O, leaving O2 as a result. Entropy breakdown:
The O2 left over from breaking up CO2 ought to have somewhat lower entropy than the original CO2.
The O2 left over from breaking up the original H2O ought to have… higher entropy because it’s a gas now?
The hydrocarbons don’t have much entropy because they stick together into big chunks that heavily constrain their DOFs, but they do have some entropy for various reasons, and they are much more tightly packed than air, so per volume they ought to have orders of magnitude more entropy density. (Claude estimates around 200x.)
Organic matter also traps a lot of water which has a high entropy density.
Usually people talk about absolute entropy rather than entropy density, but it’s unclear to me what it means for organisms to “locally” increase/decrease entropy if not by density.
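To put a rough number on the entropy-density comparison above, here is a back-of-the-envelope sketch using textbook standard molar entropies at 298 K (all values approximate; graphite is used as a crude stand-in for bound carbon in organic matter):

```python
# Rough entropy-density comparison: air vs. condensed matter.
# Standard molar entropies, J/(mol*K), at 298 K (textbook values, approximate):
S_O2_gas = 205.2     # O2 gas, as a proxy for air
S_water_liq = 70.0   # liquid water
S_graphite = 5.7     # graphite, a crude stand-in for bound carbon

# Molar volumes, m^3/mol (approximate):
V_gas = 0.0245       # ideal gas at 298 K and 1 atm
V_water = 18e-6      # liquid water
V_graphite = 5.3e-6  # graphite

density_air = S_O2_gas / V_gas            # ~8e3 J/(K*m^3)
density_water = S_water_liq / V_water     # ~3.9e6 J/(K*m^3)
density_carbon = S_graphite / V_graphite  # ~1.1e6 J/(K*m^3)

print(f"air:    {density_air:.2e} J/(K*m^3)")
print(f"water:  {density_water:.2e} J/(K*m^3)")
print(f"carbon: {density_carbon:.2e} J/(K*m^3)")
```

Condensed phases come out a few hundred times denser in entropy than air, the same order of magnitude as the ~200x estimate quoted above.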
Oxygen + hydrocarbons = lots of free energy, while water + carbon dioxide = not so much free energy. We usually associate free energy with low entropy, but that’s relative to the burned state where the free energy has been released into thermal energy. In this case, we should instead think relative to an unlit state where the energy hasn’t been collected at all. Less energy generally correlates with lower entropy.
Am I missing something?
I have no idea whether I can post anything here yet but I’ll try to answer: all arguments other than the one from basic chemistry are irrelevant. Organic life may be easily construed as a controlled combustion of organic fuel, one drawn out over a very long span as opposed to happening all at once like an actual fire, and that is a chemical reaction that proceeds in the direction of increasing entropy: all organic foodstuffs are in the condensed phase (liquids and solids) and their combustion products are gases. Ergo, organic life increases entropy overall. Simple, right? ~Chara
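This combustion claim can be spot-checked with standard molar entropies (textbook values at 298 K, approximate; glucose is used here as a stand-in for “organic fuel”):

```python
# Reaction entropy for glucose combustion:
#   C6H12O6(s) + 6 O2(g) -> 6 CO2(g) + 6 H2O(l)
# Standard molar entropies, J/(mol*K), at 298 K (approximate textbook values):
S = {"glucose_s": 212.0, "O2_g": 205.2, "CO2_g": 213.8, "H2O_l": 70.0}

dS = (6 * S["CO2_g"] + 6 * S["H2O_l"]) - (S["glucose_s"] + 6 * S["O2_g"])
print(f"delta S = {dS:+.1f} J/(mol*K)")  # positive: combustion increases entropy
```

The reaction entropy comes out positive even if the water is counted as liquid, and the released heat (roughly 2800 kJ/mol dumped into surroundings near 298 K) adds far more entropy on top of that.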
I think one shouldn’t think of entropy as fundamentally preferred or fundamentally associated with a particular process. Note that it isn’t even a well-defined parameter unless you posit some macrostate information and define entropy as a property of a system + the information we have about it.
In particular, life can either increase or decrease appropriate local measurements of entropy. We can burn the hydrocarbons or decay the uranium to increase entropy or we can locally decrease entropy by changing reflectivity properties of earth’s atmosphere, etc.
The more fundamental statement, as jessicata explains, is that life uses engines. Engines are trying to locally produce energy that does work rather than just heat, i.e., that has lower entropy compared to what one would expect from a black body. This means that they have to use free energy, which corresponds to tapping into aspects of the surrounding environment where entropy has not yet been maximized (i.e., which are fundamentally thermodynamic rather than thermostatic), and they also have to generate work which is not just heat (i.e., they can’t just locally maximize the entropy). Life on earth mostly does this by using the fact that solar radiation is much higher-frequency than black-body radiation associated with temperatures on Earth, and thus contains free energy (that can be released by breaking it down).
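The sunlight point can be made quantitative with a standard rough estimate (temperatures approximate): the mean energy of a thermal photon scales with the source temperature, so each absorbed solar photon is eventually re-emitted by Earth as roughly T_sun / T_earth lower-frequency thermal photons, a large multiplication of photon count (and roughly of entropy) that life can tap along the way.

```python
# Sunlight arrives as high-frequency photons radiated at ~5800 K and leaves
# Earth as low-frequency thermal photons at ~290 K. Since mean photon energy
# scales with source temperature, each solar photon's energy ends up spread
# over roughly T_sun / T_earth thermal photons.
T_sun = 5800.0   # K, approximate solar photospheric temperature
T_earth = 290.0  # K, approximate terrestrial emission temperature

photons_out_per_photon_in = T_sun / T_earth
print(f"~{photons_out_per_photon_in:.0f} thermal photons per solar photon")
```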
Maybe I’ll add two addenda:
It’s easy to confuse entropy with free energy. Since energy is conserved, globally the two measure the same thing. But locally, the two decouple, and free energy is the more relevant parameter here. Living processes often need to use extra free energy to prevent the work they are interested in doing from getting converted into heat (e.g. when moving we’re constantly fighting friction); in this way we’re in some sense locally increasing free energy.
I think a reasonable (though imperfect) analogy here is with potential energy. Systems tend to reduce their potential energy, and thus you can make a story that, in order to avoid just melting into a puddle on the ground, life needs to constantly fight the tendency of gravitational potential energy to be converted to kinetic energy (and ultimately heat). And indeed, when we walk upright, fly, build skyscrapers, or use hydro power, we’re slowing down or modifying the tendency of potential energy to become kinetic. But this is in no sense the fundamental or defining property of life, whether we’re looking globally at all matter or locally at living beings. We sometimes burrow into the earth, flatten mountains, etc. While life both (a) can use the potential energy of other stuff to power its engines and (b) needs to at least somewhat fight the tendency of gravitational kinetic energy to turn it into a puddle of matter without any internal structure, this is just one of many physical stories about life and isn’t “the whole story”.
We should not expect increasing entropy a priori, because the Second Law holds only in closed systems. Open systems can in general have arbitrary entropy production. Under some nice conditions, Prigogine’s theorem shows that in open systems entropy production is minimized. And the Earth, thanks to the Sun, is an open system.
You’re analyzing the wrong components of life. The main low-entropy components are membranes, active transport, the excretory system, ionic gradients, constant acidity levels, etc. Oxygen is far down the list, because oxygen is actually a toxic waste product of photosynthesis.
Entropy production is not the same as entropy, though. I think entropy production can be minimized by maximizing local entropy, since then there’s no more room for entropy to increase? E.g. once most of the CO2 has been broken up into carbon, there’s not much more photosynthesis that can be done.
They are all very dense, so they have high local entropy.
When I say “arbitrary” I mean “including negative values”.
I think your notion of life as decreasing entropy density is clearly wrong, because black holes are maxentropy objects, yet black hole volume is proportional to the cube of the mass while black hole entropy is proportional to the horizon area, i.e., to the square of the mass, so the density of entropy decreases as a black hole grows, and black holes are certainly not alive under any reasonable definition of life. Or, you can take black holes in the very far future, when they make up most of the matter, and the increasing-entropy evolution of the universe results in black hole evaporation, which decreases the density of entropy to almost zero.
My notion wasn’t that life decreases entropy, my notion was that life increases entropy.
Black holes seem like a suboptimal hypothetical since we don’t really know what’s going on inside them. Their entropy especially seems paradoxical.
Under my model, density of entropy ought to increase with the growth of life.
I see. Though, what would that look like for Earth, using free energy to sort all the resources into separate bins? Which I suppose is something a utility maximizer might want. But are we really anywhere close to that? Maybe the theorem just doesn’t apply yet, since it’s only supposed to apply to a steady state.
If black hole entropy seems paradoxical to you, then I don’t think you’ve really understood the concept. It’s because we have no idea what’s going on inside a black hole that they are maxentropy objects. Every possible microstate (internal arrangement of matter and energy) corresponds to the same macrostate (mass, charge, angular momentum, linear momentum). The natural logarithm of the number of possible microstates that corresponds to the current macrostate, times Boltzmann’s constant, is the entropy, and black holes necessarily maximize that.
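In symbols, the definition above, together with the Bekenstein–Hawking formula for the black-hole case:

```latex
S = k_B \ln \Omega,
\qquad
S_{\mathrm{BH}} = \frac{k_B c^3 A}{4 \hbar G},
\qquad
A = \frac{16 \pi G^2 M^2}{c^4},
```

so black-hole entropy grows with the horizon area, i.e., with the square of the mass.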
Living things increase entropy in their environment, for sure, because they consume free energy in order to preserve the low-entropy aspects of their own internal structure. If this seems in contrast with what you’ve been told or taught about life to date, then I suspect you haven’t had very good teachers.
Over time, internal entropy of an organism will increase until it dies, because the self-preservation and repair mechanisms are not perfect. But at any given moment, if you killed the organism, it would decay until it reached equilibrium with its environment, and the entropy of what used to be its body would increase more and faster. This is because the living organism had been acting to keep its own entropy lower than it would be otherwise.
I understand what entropy is, but entropy is supposed to increase because the underlying dynamics are reversible, and so it’s paradoxical if black holes formed from different initial states truly are identical, as that seems to imply the dynamics aren’t reversible.
What do you mean by “low-entropy aspects of their own internal structure”? Entropy is a scalar quantity. If e.g. the body temperature of an animal increases its entropy more than the cellular repetition etc. decreases its entropy, then the animal overall is increasing rather than decreasing entropy, and my point in the OP holds.
The underlying dynamics are reversible, if weirdly. Black holes have non-zero temperature and emit Hawking radiation, slowly evaporating in the process.
And yes, entropy is a scalar, and it’s well defined both for an overall system and for each subset of that system. What keeps the entropy of a living thing lower than that of an undifferentiated soup of molecules? Structures that separate and organize those molecules. That’s what I mean by aspects.
Others have already noted that it seems like you’re asking a hard-to-parse set of nonstandard questions, so sorry for any misunderstandings on my part.
You’re right that internally, organisms control the production and flow of entropy, rather than the absolute entropy as such. So if you’re asking whether the entropy of the body is higher than it would be if it were cooled to a lower room temperature, then yes. But that means the answer to your question depends on whether the organism is currently in Phoenix in summer or Canada in winter, or whether it’s warm or cold blooded. I’m not sure this question is interesting in regards to how likely life was to evolve in the first place. I suspect not very.
Is spontaneous freezing of water into ice at low temperature a violation of the second law? No, because as described you wouldn’t be measuring the same system over time. You’d be measuring “Water” on one side and “Ice but not the heat given off” on the other. This also is why most of the rest of your bullet points are not really related to the second law as such. You can’t count the entropy of O2 evolved in photosynthesis while ignoring that of the C and the H and the absorbed photons and the emitted waste heat.
It means, how fast is the total entropy of the combined system increasing, and where is the entropy going?
If I mix together (dissolve) a pile of small food molecules (water, sugars, amino acids, fatty acids, glycerol, nucleic acids, mineral salts), with the same elemental composition and temperature as my body, which has higher entropy? The former.
If I cool the pile but add enough extra sugar that, if burned, would heat it to body temperature, then what’s the answer? I’m not sure. But if I did the sugar burning to raise the temperature, consumed ambient O2, and emitted CO2 and H2O in the process, then the resulting system (hot solution + emitted gases) has even more entropy than the initially-warm pile.
Over the course of my life, my body built itself up out of exactly those kinds of components, creating a lower-entropy-than-a-solution-of-food-molecules body and a high entropy stream of waste gases, liquids, solids, and heat. This is the sense in which bodies are low-entropy.
The answer to the deep problem of the black hole information paradox you’ve mentioned is: Information does leak out of a black hole, albeit likely in encrypted form, and importantly black holes don’t destroy information, they preserve information.
We don’t know how it’s leaked or how it gets out, we only have speculations on the process, but we do know it gets out eventually.
I kinda think of ‘free energy’ as something that living creatures in some sense ‘consume’, with ‘entropy’ as what they leave behind. We use the ‘order’ present in the universe to advance our goals (e.g. homeostasis), leaving a trail of higher entropy. We harness a gradient of incoming energy and order.
The sunlight which a plant absorbs might counterfactually have been turned into heat after being absorbed by the ground, and ended up in the same entropy state (from the perspective of the universe). Or it might have reflected, traveled light-years through space, and warmed some other thing. The leaf managed to insert itself into this process, intercepting the free energy, and increased the entropy of the universe more rapidly than counterfactually expected.
And living multi-cellular beings are basically made up of tiny entities, cells, which are generally doing the metabolism process internally. And then mitochondria and chloroplasts within cells. But it would be a mistake to say that the living thing is causing itself to be disordered because it’s increasing entropy in parts of itself. It’s spending free energy (and ‘excreting’ entropy) in order to accomplish things. For instance, using a muscle (converting some of its stored energy to motion and waste heat) in order to bring food to the creature’s mouth. The creature is creating an anti-entropic state, pursuing its specific goals, by increasing the probability that the universe corresponds to its goal state, at the cost of making other things extra entropic (always with some extra loss along the way, like from friction). You are missing the order that the agent is creating in the world if you aren’t analyzing the world in terms of how likely the agent’s goals were to be achieved by random chance (e.g. Brownian motion) versus by active optimization efforts by the agent. Anytime a living creature agentically does anything, it is consuming free energy and excreting entropy.
That’s my understanding anyway, but I may be using the physics terms wrong since I’m not a physicist.
The amount of entropy in a given organism stays about the same, though I guess you could argue it increases as the organism grows in size. Reason: The organism isn’t mutating over time to become made of increasingly high entropy stuff, nor is it heating up. The entropy has to stay within an upper and lower bound. So over time the organism will increase entropy external to itself, while the internal entropy doesn’t change very much, maybe just fluctuates within the bounds a bit.
It’s probably better to talk about entropy per unit mass, rather than entropy density. Though mass conservation isn’t an exact physical law, it’s approximately true for the kinds of stuff that usually happen on Earth. Whereas volume isn’t even approximately conserved. And in those terms, 1kg of gas should have more entropy than 1kg of condensed matter.
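A quick per-mass comparison for the same substance in two phases, using textbook standard molar entropies at 298 K (approximate values):

```python
# Specific entropy (per kg) of water vapor vs. liquid water, from standard
# molar entropies at 298 K (approximate textbook values):
S_vapor = 188.8   # J/(mol*K), H2O gas
S_liquid = 70.0   # J/(mol*K), H2O liquid
M = 0.018         # kg/mol, molar mass of water

s_vapor = S_vapor / M    # ~1.0e4 J/(kg*K)
s_liquid = S_liquid / M  # ~3.9e3 J/(kg*K)
print(f"vapor/liquid specific-entropy ratio: {s_vapor / s_liquid:.1f}")
```

Per unit mass, the gas carries roughly three times the entropy of the liquid, consistent with the claim above.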
I mean, actually it is. Plus accumulation of various kinds of damage, experiences, etc. which makes it differ from other organisms.
Looking it up, apparently people drop very slightly in temperature as they age, which might dominate the entropy considerations (though that is presumably due to slowly dying, so it also seems compatible with entropy being related to life, if reduction in life is related to reduction in entropy).
Couldn’t it be reasonable to say that entropy increases as a sign of increased vitality associated with growing up to adulthood, and then afterwards has a mixture of an infinitesimal increasing effect from life experience and a moderate decreasing effect associated with vitality breakdown?
But if we go by unit mass, shouldn’t we count both the entropy in the air and the entropy in the organic matter, since they’re both derived from the original mass that goes into life, meaning that life still increases entropy?
I think the correct unit is “per particle” or “per mole”.
Of atoms or of molecules?
If we go there, I guess the best unit is “per degree of freedom”.
I think the confusion may arise from this concept of ‘entropy density’?
To compare density levels, we need to look at a specific volume or amount of matter (recall the units of entropy are energy over temperature, with no reference to space or mass). In a closed system the 2nd law tells us that overall entropy only goes up, but it does not help us differentiate between different areas of density. It also tells us that, overall, the density will increase over time, which is not intuitive.
Considering open systems makes things easier. Energy and matter can flow in and out. You can still sample your ‘entropy density’ in defined volumes, and you may indeed find that ‘life has increased entropy locally’. But, through thermodynamic coupling, a greater amount of entropy has been exported to the environment. This coupling is what Dimitry and jessicata refer to as ‘life engines’.
In summary, the ‘entropy density’ concept needs to be considered carefully in local vs. global terms.
If you were to analyze in which places life increases entropy and in which places life decreases entropy, what would your decomposition look like?
I think you’re flogging a dead horse with this line of questioning. Or perhaps it’s the teleological language you choose to employ. There are a bunch of good replies. What is it you actually want to know? To get better answers you need to ask better questions.