Generator Systems: Coincident Constraints
The Prototype Plane Perfecter
You are tasked with improving the design of a plane, which keeps falling out of the sky when it reaches 80 mph. This makes sense: it is a prototype, put together with no knowledge of high-speed flight. Unfortunately, the prototypes always seem to disappear over the horizon just before failure, and crash in such a huge fireball that no evidence as to the cause can be collected.
You have three hypotheses as to why: the engine is spinning itself apart; the wings are falling off; the instruments are failing. Each of these requires some cost to improve, so your bosses are insistent that you solve the issue, and no more. You make three new planes.
Each of them sets off, and each of them crashes at the exact same distance from the runway (to within random error of course). None of the improvements have increased the top speed.
Naturally, you assume that none of these is constraining the top speed. You look for other reasons the planes might be crashing.
The Mortal Trees
A scattered group of trees wish to live forever. They do not age, so the only threat to them is being toppled by the wind. Each stormy night they all fear terribly for their lives. Their scientist-trees get to work. They find that 80 mph wind is generally fatal.
One school of medical researcher-trees studies trunks. They believe trunk-snapping is the ultimate cause of toppling. With a strong regimen of lignin-enhancers, they manage to fortify themselves physically.
Another school studies roots. They believe that uprooting is the ultimate cause of toppling. With soil rigidification, they ensure their balance is perfect.
The next storm, each waits with bated breath. Both groups find they have lost some of their own, and barely any fewer than the trees in the control group. Those still standing make peace, and accept the inevitability of toppling.
Lessons
In the first story, it is not so clear what’s happening. In the second, it is. The difference is that the generator system for prototype plane designs is very different from the generator system for tree designs.
If a plane is put together with no knowledge of high-speed flight, it is very unlikely that all of its systems will fail at exactly the same speed. The same holds for most systems: in plant growth, one nutrient is usually limiting, and the same is often true in manufacturing. In such cases, locating the limiting factor is paramount to improving the system.
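The limiting-factor point here is Liebig's law of the minimum: output is set by the scarcest input, so improving anything else does nothing. A two-line sketch, with made-up nutrient numbers:

```python
# Liebig's law of the minimum: growth is set by the scarcest input,
# so improving any non-limiting input changes nothing.
def growth_rate(nutrients):
    # nutrients: available supply as a fraction of requirement (made-up units)
    return min(nutrients.values())

base = growth_rate({"nitrogen": 0.4, "phosphorus": 0.9, "potassium": 1.2})
extra_potassium = growth_rate({"nitrogen": 0.4, "phosphorus": 0.9, "potassium": 5.0})
extra_nitrogen = growth_rate({"nitrogen": 1.0, "phosphorus": 0.9, "potassium": 1.2})
print(base, extra_potassium, extra_nitrogen)  # 0.4 0.4 0.9
```

Quadrupling potassium changes nothing; only relieving the nitrogen bottleneck moves the output, and then phosphorus becomes the new constraint.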
The generator system for tree designs is evolution. This is not random. In real life, trees generally do uproot and snap at similar wind speeds. This is because both root stability and trunk strength are metabolically costly, so investing in making one much stronger than the other is a poor strategy. A population of trees which uproot at 60 mph but would hypothetically snap at 100 mph will experience two things: if stronger roots are not too costly, genes for these will become more frequent; and genes for weaker trunks will become less frequent. There will be an equilibrium point where winds of a certain strength are so rare that withstanding them is not worth the metabolic cost.
Generally, multi-causal models are subject to a significant complexity penalty: if you think 50 different things contributed to falling crime rates in the 90s, you must also explain why all 50 different things happened at the same time. This is not true when the generator system is evolution.
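A back-of-the-envelope version of that penalty, with made-up numbers (the 1-in-5 per-decade chance is purely illustrative, not a claim about the actual base rate):

```python
import math

p_per_decade = 0.2   # assumed chance any one independent change lands in a given decade
n_causes = 50

# Probability that all 50 independent changes happen to land in the same
# target decade, and the same quantity expressed as a penalty in bits.
p_coincidence = p_per_decade ** n_causes
penalty_bits = -n_causes * math.log2(p_per_decade)

print(f"P(all {n_causes} coincide in one target decade) = {p_coincidence:.2e}")
print(f"complexity penalty ~= {penalty_bits:.0f} bits")
```

Under these assumptions the 50-way coincidence costs roughly 116 bits. When the generator system is evolution, the causes are not independent, so this multiplication no longer applies.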
This was written as a direct parallel to ageing. The involution of the thymus gland is basically unrelated to other types of pathological ageing, as far as we can tell. The constraint systems just seem to sort of line up time-wise. The same could be true for other proposed mechanisms of ageing.
Whatever object you’re studying, it will have been created by some generator system: evolution, bad engineering, good engineering, free markets, political design. Understanding the generator system gives you good priors. How much weight to put on those priors versus direct testing depends on how cheap experimentation is (does each test cost a plane’s worth of money, or does every delay cost human lives?).
This is a great and under-appreciated point.
What if there are just a lot of things that happen in general? In particular in modernity, where a lot of stuff has obviously changed due to technology.
If one technological advance—like mobile phones—causes a multitude of small changes, which all push one outcome in the same direction, then that’s sort of a single-cause model in disguise. It still pays a complexity penalty as a hypothesis but a smaller one. On the other hand it is worth asking why the consequences of something all (or almost all) push the lever of crime in a specific direction, and this is not true for other technologies.
If you mean modernity in general leading to a lot of technological advances, then we’re back to the same problem: the advances that decrease crime should be fairly randomly distributed in time. If we see a big change in crime rate in one period and not anywhere else, then either one factor has a disproportionate impact on crime, or a disproportionate number of crime-decreasing technological changes have occurred at once. The latter pays a complexity penalty.
If you mean a big change over the last ~150 years, then yeah I’d say having lots of causes for certain trends makes sense.
I mean something like: modernity has led to improvements to a whole bunch of different things (and also worsening in some small number of other things). It doesn’t seem like it would be all that surprising to me that improvements would on average have some sort of directional effect, even if a priori predicting that effect (easier to do crime vs easier to prevent crime) is hard.
Ah, I should have clarified what you meant before responding.
I agree with this assessment, but modern-ness has been increasing at a reasonable rate for the past six decades at least. If modernity just caused a bunch of changes with a net effect on crime, we would see a (relatively) steady increase. The time distribution of changes in crime rates tells us something else is going on.
Unless an argument gives good reasons why—for example—there is some property of the 90s that produced an exceptional number of improvements which reduced crime and very few which increased it, as opposed to other decades where the improvements both increased and reduced crime and mostly cancelled out, then that explanation suffers a big complexity penalty.
Even if all the arguments as to why certain technologies decreased crime rather than increasing it seem solid, we should be very suspicious of the coincidence of them all happening at once. That sort of thinking smacks of post-hoc rationalization and the conjunction fallacy.