ISTM that the actual present usage of “emergent” is pretty well-defined as a cluster, and it doesn’t include the ideal gas laws. I’m offering a candidate way to cash out that usage without committing the Mind Projection Fallacy.
The fallacy here is thinking there’s a difference between the way the ideal gas laws emerge from particle physics, and the way intelligence emerges from neurons and neurotransmitters. I’ve only heard “emergent” used in the following way:
A system X has emergent behavior if we have heuristics for both a low-level description and a high-level description, and the high-level description is not easily predictable from the low-level description
For instance, the diagonal movement of gliders across the screen is emergent in Conway’s Life.
The “easily predictable” part is what makes emergence in the map, not the territory.
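The glider example is easy to check directly. Here is a minimal sketch in Python (the function name and the sparse set-of-live-cells representation are my own choices, not anything from the thread) showing that the standard glider, evolved four generations under the Life rules, reproduces itself shifted one cell diagonally:

```python
from collections import Counter

def life_step(cells):
    """One generation of Conway's Life on a sparse set of live (row, col) cells."""
    # Count how many live neighbors each cell has.
    counts = Counter((r + dr, c + dc)
                     for r, c in cells
                     for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                     if (dr, dc) != (0, 0))
    # A cell is live next generation if it has 3 neighbors,
    # or 2 neighbors and was already live.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in cells)}

# The standard glider, which drifts one cell diagonally every 4 generations.
glider = {(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)}
cells = glider
for _ in range(4):
    cells = life_step(cells)
assert cells == {(r + 1, c + 1) for r, c in glider}
```

Nothing in `life_step` mentions gliders or diagonal motion; the translation falls out of the neighbor-counting rule.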
Er, did you read the grandparent comment?

Yes. My point was that emergence isn’t about what we know how to derive from lower-level descriptions; it’s about what we can easily see and predict from lower-level descriptions. Like Roko, I want my definition of emergence to include the ideal gas laws (and I haven’t heard the word used to exclude them).

Also see this comment.
For what it’s worth, Cosma Shalizi’s notebook page on emergence has a very reasonable discussion of emergence, and he actually mentions macro-level properties of gas as a form of “weak” emergence:
The weakest sense [i]s also the most obvious. An emergent property is one which arises from the interaction of “lower-level” entities, none of which show it. No reductionism worth bothering with would be upset by this. The volume of a gas, or its pressure or temperature, even the number of molecules in the gas, are not properties of any individual molecule, though they depend on the properties of those individuals, and are entirely explicable from them; indeed, predictable well in advance.
To define emergence as it is normally used, he adds the criterion that “the new property could not be predicted from a knowledge of the lower-level properties,” which looks to be exactly the definition you’ve chosen here (sans map/territory terminology).
Let’s talk examples. One of my favorite examples to think about is Langton’s Ant.

If we taboo “emergence” what do we think is going on with Langton’s Ant?
We have one description of the ant/grid system in Langton’s Ant: namely, the rules which totally govern the behavior of the system. We have another description of the system, however: the recurring “highway” pattern that apparently results from every initial configuration tested. These two descriptions seem to be connected, but we’re not entirely sure how. (The only explanation we have is akin to this: Q: Why does every initial configuration eventually result in the highway pattern? A: The rules did it.) That is, we have a gap in our map.
Since the rules, which we understand fairly well, seem on some intuitive sense to be at a “lower level” of description than the pattern we observe, and since the pattern seems to depend on the “low-level” rules in some way we can’t describe, some people call this gap “emergence.”
I recall hearing, although I can’t find a link, that the Langton Ant problem has been solved recently. That is, someone has given a formal proof that every ant results in the highway pattern.
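To see how little machinery is involved, here is a minimal sketch of the ant in Python (the function name and coordinate conventions are my own). The two turn-and-flip rules below are the entire low-level description; the highway appears nowhere in them. On an initially empty grid the ant enters the highway after roughly 10,000 steps, after which it advances two cells diagonally every 104 steps:

```python
def run_ant(steps):
    """Run Langton's Ant on an all-white grid; return the ant's final position."""
    black = set()              # cells that are currently black
    x, y = 0, 0                # ant position
    dx, dy = 0, 1              # heading, initially "up"
    for _ in range(steps):
        if (x, y) in black:
            dx, dy = -dy, dx       # black cell: turn left...
            black.remove((x, y))   # ...and flip it to white
        else:
            dx, dy = dy, -dx       # white cell: turn right...
            black.add((x, y))      # ...and flip it to black
        x, y = x + dx, y + dy      # move forward one square
    return x, y

# Well past the ~10,000-step transient, the highway's 104-step period
# shows up as a net displacement of two cells in each coordinate.
x1, y1 = run_ant(11_000)
x2, y2 = run_ant(11_000 + 104)
assert abs(x2 - x1) == 2 and abs(y2 - y1) == 2
```

The final assertion is the high-level description; deriving it from the loop above is exactly the gap the thread is pointing at.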
No, I want my definition of “emergent” to say that the ideal gas laws are emergent properties of molecules.
Why not just say
We say that a system X has emergent behavior if we have heuristics for both a low-level description and a high-level description
The high-level structure shouldn’t be the same as the low-level structure, because I don’t want to say a pile of sand emerges from grains of sand.