Brilliant! I think we often have a hard time seeing these cheat codes. One reason for this might be the following (OK, a bit speculative, granted):
Gladwell or Taleb writes in some book that we have a gut feeling that there should be a correspondence between cause and effect: we have trouble conceiving of significant effects (e.g. the president of the US being murdered) as having trivial or uninteresting causes (e.g. a lunatic). Instead we postulate that there must be a significant cause—e.g. a Soviet conspiracy. Similarly, we have trouble conceiving of significant events as the result of randomness, and instead postulate that they must have significant causes.
(I don’t know if there is a name for this bias, or if it indeed has been proven in experiments. It seems intuitive that we have it, though.)
In line with this, it might be that we expect effort (i.e. cause) and benefit (i.e. effect) to normally match each other. If this is right, it would perhaps go some way towards explaining why humanity invented so little for such a long time, even though many inventions that increased productivity were relatively easy to come up with.
Of course people do invent, but it would seem to me that this is to a large extent an attitude we've learnt from the scientific revolution onwards—it is not our primary biological disposition.
Also, in many cases it is of course true that attempts at "easy solutions" are misguided (as mentioned in the OP), so there is some ground for the heuristic that benefit and effort do match (perhaps particularly in a fiercely competitive and inventive society such as today's Western society). But like all heuristics, it is not always helpful (though it is so on average).
In any case, the fact that we seem to have trouble spotting these cheat codes makes it particularly useful to discuss them as is done here.