If the errors were a few percent, randomly up or down, it wouldn’t matter; but the inaccuracy is not tiny. Over long timescales it amounts to many orders of magnitude, and it is almost always in the same direction: growth and decay are slower over the long term than exponential models predict.
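To make the point concrete, here is a minimal numeric sketch (with made-up parameters: a growth rate, carrying capacity, and starting value chosen purely for illustration) comparing a naive exponential extrapolation with a saturating logistic model. The two agree almost exactly early on, then the exponential overshoots by ever more orders of magnitude:

```python
import math

# Illustrative parameters (assumed, not from any real dataset):
r, K, x0 = 0.1, 1e6, 1.0  # growth rate, carrying capacity, initial value

def exponential(t):
    # Unbounded exponential extrapolation.
    return x0 * math.exp(r * t)

def logistic(t):
    # Logistic growth saturating at the carrying capacity K.
    return K / (1 + (K / x0 - 1) * math.exp(-r * t))

for t in (10, 100, 300):
    ratio = exponential(t) / logistic(t)
    print(f"t={t:>3}: exponential overestimates by a factor of {ratio:.3g}")
```

At t=10 the two models differ by a fraction of a percent; by t=300 the exponential is off by roughly seven orders of magnitude, all in the direction of overprediction.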
Oh yes, but it’s not just a predilection for simple models in the first place; it’s also a tendency to culturally and cognitively simplify the model we reach for, even when the original model had extensions to handle the case at hand, and even at the cost of orders of magnitude of error.
Of course, sometimes it may be worth quickly computing an estimate that is (unknown to you) orders of magnitude off. Certainly, if the impact of the estimate is delayed and subtle, less conscious trade-offs may creep in between the cognitive effort of accessing and using a more detailed model and the consequences of error. Yet another form of akrasia.