A simple statement of fact: there is no physics theory that explains the nature of, or even the existence of, football matches, teapots, or jumbo-jet aircraft. The human mind is physically based, but there is no hope whatever of predicting the behaviour it controls from the underlying physical laws. Even if we had a satisfactory fundamental physics ‘theory of everything’, this situation would remain unchanged: physics would still fail to explain the outcomes of human purpose, and so would provide an incomplete description of the real world around us.
[...]
the higher levels in the hierarchy of complexity have autonomous causal powers that are functionally independent of lower-level processes. Top-down causation takes place as well as bottom-up action, with higher-level contexts determining the outcome of lower-level functioning, and even modifying the nature of lower-level constituents.
I think this is a hugely unappreciated fact about the universe. Macroscopic behavior can be insensitive to virtually all microscopic variation, in the sense that some small set of macroscopic variables obeys some relation without special regard to the particular microstate in existence, e.g., PV = nRT. And yet, interactions that can be described entirely at the macroscopic level may end up causing huge changes to microscopic states.
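This insensitivity is easy to see numerically. The following toy sketch (my own illustration, not from the discussion; the particle count, mass, and temperature are arbitrary choices) draws several completely different microstates of an ideal gas and checks that the macroscopic relation PV = NkT holds for every one of them:

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100_000          # number of particles (arbitrary)
k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # temperature, K
m = 4.65e-26         # particle mass, kg (roughly an N2 molecule)
V = 1.0e-3           # volume, m^3

# Kinetic theory: P = N * m * <v_x^2> / V for an ideal gas.
# Each trial draws a completely different microstate (velocity assignment),
# yet the macroscopic relation P*V ≈ N*k_B*T holds for all of them.
sigma = np.sqrt(k_B * T / m)  # std of one velocity component at temperature T
for trial in range(3):
    vx = rng.normal(0.0, sigma, size=N)   # one particular microstate
    P = N * m * np.mean(vx**2) / V
    print(f"microstate {trial}: P*V = {P * V:.4e}, N*k_B*T = {N * k_B * T:.4e}")
```

Any other microstate drawn the same way gives the same product to within sampling noise, which is the point: the macroscopic relation never needed the microscopic details.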
E. T. Jaynes had an important insight about these sorts of things: if something macroscopic happens reproducibly in spite of no fine control over the microstate, then it must be the case that the process is insensitive to microscopic variation; (no duh, right? But --) therefore we will be able to make macroscopic predictions in spite of having no microstate knowledge just by picking the probability distribution over microstates that maximizes entropy subject to the constraints of our macroscopic knowledge.
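Jaynes's recipe can be shown in miniature. In this sketch (my own toy construction; the five "energy levels" and the target mean are invented for illustration), the only macroscopic knowledge is a mean energy, the entropy maximizer under that constraint has the Gibbs form p_i ∝ exp(−λE_i), and random constraint-preserving perturbations confirm that nothing feasible beats it:

```python
import numpy as np

# Jaynes's maxent recipe in miniature: our only macroscopic knowledge is a
# mean "energy" over five toy levels.  Under a mean constraint the entropy
# maximizer is the Gibbs form p_i ∝ exp(-lam * E_i); we find lam by bisection.
E = np.array([0.0, 1.0, 2.0, 3.0, 4.0])  # toy energy levels
target_mean = 1.2                        # our only macroscopic constraint

def gibbs(lam):
    w = np.exp(-lam * E)
    return w / w.sum()

# The mean energy of gibbs(lam) is strictly decreasing in lam, so bisect.
lo, hi = -50.0, 50.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if gibbs(mid) @ E > target_mean else (lo, mid)
lam = 0.5 * (lo + hi)
p_maxent = gibbs(lam)

def entropy(p):
    return -np.sum(p * np.log(p))

# Spot-check maximality: perturb p_maxent in random directions that preserve
# both constraints (normalization and mean energy); entropy never goes up.
rng = np.random.default_rng(1)
constraints = np.linalg.qr(np.stack([np.ones_like(E), E], axis=1))[0]
for _ in range(100):
    v = rng.normal(size=E.size)
    v -= constraints @ (constraints.T @ v)   # project out constraint directions
    eps = 0.2 * p_maxent.min() / (np.abs(v).max() + 1e-12)
    q = p_maxent + eps * v                   # still feasible and positive
    assert entropy(q) <= entropy(p_maxent) + 1e-12

print("lam =", round(lam, 4), " p_maxent =", np.round(p_maxent, 4))
```

The resulting distribution makes sharp predictions about macroscopic averages while saying as little as possible about which microstate is actually realized, which is exactly the insensitivity the reproducibility argument demands.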
Consider what this means for so-called “emergent properties”. If a system reproducibly displays some “emergent” property once enough constituent parts are aggregated and the aggregation process is largely or entirely uncontrolled, then we ought to be able to predict the emergence of that property by taking a maximum entropy distribution over the details of the aggregation process. (And if control of some aspect of the aggregation process is important, we can incorporate that fact as a constraint in the entropy maximization.)
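A minimal example of that prediction (again my own toy, not from the thread): aggregate many two-state parts with a completely uncontrolled aggregation process. Maximum entropy with no constraints is the uniform distribution over microstates, and that alone predicts the "emergent" macroscopic regularity, a mean that concentrates near zero with spread ~1/√N:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy "emergence": aggregate N two-state parts (spins = ±1) with an entirely
# uncontrolled aggregation process.  Maxent with no constraints is the uniform
# distribution over microstates, i.e. each part is ±1 with equal probability.
# That alone predicts the emergent macroscopic regularity: the mean
# magnetization concentrates near 0 with spread ~ 1/sqrt(N).
N = 10_000
trials = 500
magnetization = np.array(
    [rng.choice([-1, 1], size=N).mean() for _ in range(trials)]
)

predicted_spread = 1.0 / np.sqrt(N)   # maxent (binomial) prediction
print(f"observed mean   : {magnetization.mean():+.4f} (predicted 0)")
print(f"observed spread : {magnetization.std():.4f} (predicted {predicted_spread:.4f})")
```

No microstate was ever specified, yet the macroscopic behavior of the aggregate is predicted quantitatively; a constrained aggregation process would simply add constraints to the maximization.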
And consciousness is sometimes said to be an emergent property of brain processes...
This is a classic “Microsoft help desk” answer: while technically correct, it doesn’t really help solve the problem. Predicting via maximum-entropy distributions is hugely more complicated than other methods for complex systems, and the only places it can really work are things like the ideal gas law and rubber bands. Put together a bunch of systems capable of exporting entropy to each other and interacting, and you’ll see the difficulty ramp up absurdly fast.
Since this is meta-level advice, there is no “the problem” in sight. Your criticism would seem to apply to cases not covered by my claim, to wit, cases where the phenomenological macrostate predictions are sharp even though the microstates are uncontrolled. If you’re saying the cases that are covered are rare, I do not deny it.
My philosophy/epistemology holds that the word “emergent” can be replaced with “reducible” with no loss of meaning. People try to sneak in anti-reductionist ideas with the concept of emergence, so what I usually do is replace the word as I’m reading and see if it still makes sense.
Yes. That is also my view. Except that “reducible” carries with it the view from below, often with the goal of explaining or deriving the macroscopic behavior (and often with the implied valuation that the macroscopic effects have no merit of their own) whereas “emergent” carries with it the view from above, where the interplay on the macroscopic level has merit and is of interest of its own.
I fully support using high-level models. After all, doing any sort of macroscopic work by modeling elementary particles is computationally intractable on a fundamental level.
The problem with emergence is that it’s used to sneak in nonreducible magic too often. I need the reminder that however interesting airplanes are, there’s nothing there that isn’t the result of elementary particles and fundamental fields.
That’s all ‘emergence’ is, really. Macroscopic behavior that is common among a huge variety of systems, possibly with widely differing microscopic details.
The cosmologist G.F.R. Ellis once wrote a one-page essay in Nature making this point (it is the source of the quote at the top): http://www.mth.uct.ac.za/~ellis/nature.pdf