In line with previous comments, I’d always understood the idea of emergence to have real content: “systems whose high-level behaviors arise or ‘emerge’ from the interaction of many low-level elements” as opposed to being centrally determined or consciously designed (basically “bottom-up” rather than “top-down”). It’s not a specific explanation in and of itself, but it does characterise a class of explanations, and, more importantly, excludes certain other types of explanation.
This comment hits the bullseye. The general idea of emergence is primarily useful in pointing out that when we don’t understand something, there are still alternatives to explanations that superstitiously posit a near-omniscience, or that pretend to have information or an ability to model complex phenomena that one does not in fact have. So, for example, a highly improbable organism does not imply a creator, a good law does not imply a legislator, a good economy does not require an economic planner, and so on, because such things can be generated by emergent processes. To come to such a conclusion does not require that we have first reasoned out the specific process by which the object in question emerged. Indeed, if we had, we wouldn’t need to invoke emergence any more, but rather some more specific algorithm, such as natural selection to explain the origin of species.
For this reason, I strongly disagree with the following definition:
Let K(·) be Kolmogorov complexity. Assume you have a system M consisting of and fully determined by n small identical parts C. Then M is ‘emergent’ if M can be well approximated by an object M’ such that K(M’) << n*K(C).
Because it is precisely in situations where a phenomenon’s complexity is not highly reducible (where M is not fully determined by n small identical parts, or where it is but K(M’) is not substantially smaller than n*K(C)) that the idea that a phenomenon is emergent, rather than the product of a near-omniscient or near-omnipotent creator, is most useful.
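To make the quoted criterion concrete, here is a minimal sketch in Python. Since K(·) is uncomputable, it uses zlib-compressed length as a crude upper bound; the function name k_approx and the particular sizes (a 100-byte part, n = 1,000) are illustrative assumptions of mine, not anything from the definition itself.

```python
import os
import zlib

def k_approx(data: bytes) -> int:
    """Crude upper bound on K(data): the zlib-compressed length.
    True Kolmogorov complexity is uncomputable; compression only bounds it."""
    return len(zlib.compress(data, level=9))

n = 1_000
part = os.urandom(100)        # one small part C (100 arbitrary bytes)
K_C = k_approx(part)          # approximate K(C), roughly 110 here

# Highly reducible: the same part repeated n times. The whole compresses
# to far less than n*K(C), so the quoted definition calls M 'emergent'.
reducible = part * n
print(k_approx(reducible), "<<", n * K_C)    # e.g. ~500 << ~110,000

# Not highly reducible: an incompressible arrangement of the same volume
# of material. K(M') is nowhere near substantially smaller than n*K(C).
irreducible = os.urandom(100 * n)
print(k_approx(irreducible), "vs", n * K_C)  # e.g. ~100,000 vs ~110,000
```

On the quoted definition, only the first case counts as emergent; the point above is that the second, irreducible case is where the concept of emergence does its real work.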
I’d add that the belief that every important phenomenon is highly reducible, and the belief that, even where a phenomenon is reducible, humans are capable of undertaking that reduction, are two other species of superstition, just as pernicious as the related superstition of the near-omniscient creator. In many, perhaps most, cases of interest we either have to be satisfied with regarding a phenomenon as “emergent” or we have to superstitiously pretend that some being has information or a capability of reduction that it does not in fact have.