I’m strikethrough-ing this comment for being less kind than the author/post deserves. But it does make true and useful points, and I don’t have the energy to rewrite it to be kinder right now, so I’m not deleting it outright.
The supercooled water example isn’t actually an example of chaos. It’s an example where the system is in a metastable state, and any perturbation causes it to switch to a more-stable state. Stable states are exactly what chaos isn’t.
A better intuition for something chaos-like: imagine that we add together a whole bunch of numbers, then check whether the result is odd or even. Changing any single number from odd to even, or vice versa, causes the end result to flip. Chaos is like that: one small perturbation can cause a large-scale change (like changing the path of a hurricane); there is a wide variety of possible small perturbations, any one of which could cause the large-scale outcome to flip back and forth between possible outcomes.
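Here's that analogy as a quick toy script (my own illustration, not anything from the post):

```python
# Toy illustration: the parity of a sum flips whenever any single
# addend's parity flips.
nums = [3, 8, 14, 7, 22, 5]
print(sum(nums) % 2)  # → 1 (the sum, 59, is odd)

flipped = nums.copy()
flipped[2] += 1          # perturb one addend from even (14) to odd (15)
print(sum(flipped) % 2)  # → 0 (the sum, 60, is now even)
```

Any one of the six numbers would have worked equally well, which is the point: every addend is a counterfactual cause of the parity.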
there is no reasonable sense in which we can say that a butterfly’s wings can cause a hurricane… Now, this counterfactual definition of causality, formalized by Judea Pearl using tools from information theory, similarly gives problems for the classic interpretation of the butterfly effect… To answer this question carefully, it isn’t enough to look at what the world would look like if we removed the butterfly, but held everything else fixed.
Um… no. Removing the butterfly and holding everything else (specifically all other initial conditions/“random” external inputs) fixed is exactly what Pearl’s counterfactual framework says to do here. And that Pearl-style counterfactual does not give any troubles whatsoever interpreting chaos. A small perturbation can indeed cause a macroscopic change, in the exact sense of “cause” formalized by Pearl: the macroscopic change would not have happened without the small perturbation, holding everything else fixed.
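To make that concrete, here's a toy sketch using the logistic map (a standard chaotic system, chosen purely for illustration, not as a model of weather): hold the dynamics and everything else fixed, perturb only the initial condition by 1e-12, and the two trajectories end up macroscopically far apart.

```python
# Toy Pearl-style counterfactual on a chaotic system: the logistic map
# at r = 4. Everything is held fixed except a tiny initial perturbation.
def trajectory(x0, steps, r=4.0):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = trajectory(0.2, 60)           # the "actual" world
b = trajectory(0.2 + 1e-12, 60)   # counterfactual: one tiny perturbation,
                                  # all else identical
max_gap = max(abs(x - y) for x, y in zip(a, b))
print(max_gap)  # the tiny difference grows to macroscopic size
```

The perturbation roughly doubles each step, so by step ~40 the two trajectories are effectively uncorrelated; in Pearl's sense, the perturbation caused the macroscopic difference, because holding everything else fixed, the difference would not have occurred without it.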
There is a perfectly reasonable sense in which small perturbations cause large changes in realistic chaotic systems, and Pearl’s counterfactual framework is exactly the “reasonable sense” in question. If this is “unreasonable” in some sense, then this post has not actually made an argument for that or said in what sense it is unreasonable.
(I do not know if weather systems are chaotic enough that a small perturbation could cause a hurricane to not happen at all, but I’m pretty sure they’re chaotic enough that a small perturbation could cause a hurricane’s path to change significantly, e.g. send it to Florida rather than Mexico or vice-versa.)
(Side-note: in the supercooled water example, the counterfactual analysis would presumably say that any particular impurity did not cause the transition, precisely because there were likely other impurities which would have caused the transition anyway.)
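(Continuing the toy-model spirit, that overdetermination looks like this, where the impurity names and the any-impurity-suffices rule are my own illustrative assumptions:)

```python
# Toy model of overdetermination: the supercooled water freezes if ANY
# impurity is present to nucleate the transition.
impurities = {"dust_1", "dust_2", "microbubble"}

def freezes(present):
    return len(present) > 0  # any single impurity suffices

print(freezes(impurities))               # → True
print(freezes(impurities - {"dust_1"}))  # → True: removing one impurity
                                         #   changes nothing, so it is not
                                         #   a counterfactual cause
```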
I do think there’s probably a way to turn the thing-you’re-trying-to-say into a viable argument, but it isn’t about chaos. In particular:
it may often be quite difficult to cleanly attribute a real-world outcome to one particular cause.
This is absolutely true. The number-adding analogy shows why: if changing any number would change the outcome, then there isn’t really a sense in which one number caused the outcome more than any other. For each number, there is a well-defined counterfactual in which that number caused the outcome.
Counterfactual analysis, i.e. “holding the world fixed and only changing that one past decision”, is not at all problematic, for butterflies or hurricanes or anything else. The mistake which I think you’re trying to point to is in arbitrarily picking one particular cause to focus on, when any other cause is just as relevant.
I’m not sure why this was crossed out—seems quite civil to me… And I appreciate your thoughts on this!
I do think we agree at the big-picture level, but have some mismatch in details and language. In particular, as I understand J. Pearl’s counterfactual analysis, you’re supposed to compare this one perturbation against the average over the ensemble of all possible other interventions. So in this sense, it’s not about “holding everything else fixed,” but rather about “what are all the possible other things that could have happened.”
I believe that would be an interventional analysis, in Pearl’s terms, not a counterfactual analysis.
I’m not sure why this was crossed out—seems quite civil to me...
I noticed this was only your fourth LW post, and you have the sort of knowledge and mindset which seems likely to yield very interesting posts in the future, so I didn’t want to leave a comment which might discourage writing more posts. I’m glad it didn’t come across too harsh. :)
Cool, and I appreciate that you think my posts are promising! I’m never sure if my posts have any meaningful ‘delta’; it seems like everything’s been said before.
But this community is really fun to post for, with meaningful engagement and discussion =)