I don’t know of a great way to phrase this so it doesn’t get mixed up with notions of personal productivity, but the basic idea is that you are yourself a complex system made of many distributed parts that pass information around, and so you should expect that, if you try to operate above roughly 60% of your maximum capacity, you’ll experience problems.
Take physical exercise, for example. If you are doing about 60% of what you are capable of, you’ll probably be able to keep it up for a long time, because you are staying safely below the point of exhausting energy reserves and damaging cells/fibers faster than they can be repaired. This is a simplification, of course: different subsystems have different limits, and you’ll run into problems if you work any one subsystem over 60% of its capacity. Your limiting factor is probably not respiration but something related to repair and replacement, so you may have to operate overall at less than 60% capacity to keep the constraining subsystem from being overworked. Thus you can, say, walk 20 miles at a slow pace with no need to rest, but will tire out and need to rest after running 5 miles at top speed.
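The walking-vs.-running contrast can be sketched as a toy model: fatigue accrues at a rate proportional to how far your load exceeds some repair rate, and at or below that rate nothing accumulates. All the numbers here are illustrative, not physiological data.

```python
# Toy model (all numbers illustrative): fatigue accrues only when load
# exceeds the repair rate; otherwise repair keeps up indefinitely.

def miles_before_rest(load, repair_rate=0.6, reserve=50.0, drain_per_mile=25.0):
    """Distance covered before the fatigue reserve is spent (None = sustainable)."""
    net = load - repair_rate              # fatigue per mile beyond what repair clears
    if net <= 0:
        return None                       # repair keeps up: no forced rest
    return reserve / (net * drain_per_mile)

print(miles_before_rest(0.5))   # slow walk: None (sustainable indefinitely)
print(miles_before_rest(1.0))   # top speed: 5.0 miles before needing rest
```

The point of the model is just the asymmetry: below the repair threshold, distance is unbounded; above it, endurance falls off sharply with intensity.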
The same sort of thing happens with mental activities: concentrate too hard for too long and you lose the ability to concentrate for a while, but concentrate lightly and you can keep it up for a long time (think of trying to read in a noisy room vs. a quiet room). It doesn’t really matter how this happens (ego depletion? homeostatic regulation encouraging you to meet other needs?); the point is that something is getting overworked and needs time to clear and recover.
To sneak in an extra observation: it’s notable where this doesn’t happen, or only rarely happens. Systems that need high uptime tend to be so highly integrated that information doesn’t have to be passed between parts; the parts instead contain a lot of mutual information. For example, the respiratory system doesn’t so much share information between its parts as act as a tightly integrated whole that immediately responds to changes in any of its parts, because it can afford zero downtime: it has to keep working even while parts are degraded and under repair. But high integration introduces complexity that scales quadratically with system size, so tight integration is limited by how much complexity the system can manage without suffering unexpected complex failures. Beyond the scaling limit of what evolution/engineering can manage, systems can only get more complex by being less integrated than their subsystems, and so you run into this capacity issue everywhere you are not so unlucky as to need 100% uptime.
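The “quadratic complexity” claim is just pairwise-link counting: if every part must be tightly coupled to every other part, a system of n parts needs n(n−1)/2 direct channels. A minimal sketch:

```python
def pairwise_links(n):
    """Direct channels needed if every part is tightly coupled to every other."""
    return n * (n - 1) // 2

for n in (4, 8, 16, 32):
    print(n, pairwise_links(n))   # doubling the parts roughly quadruples the links
```

So a fully integrated system of 32 parts already needs 496 channels where 16 parts needed 120, which is why past some size it’s cheaper to build loosely coupled subsystems than to keep everything integrated.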
Comparing peak to sustainable running speeds: the average speed of the world marathon record is 55% of the world 100m sprint record. For the Olympic men’s qualifying times, the ratio is 54%. Both are quite close to 60%.
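The marathon ratio is easy to check. A sketch of the arithmetic, assuming the figures the comment presumably used (Bolt’s 9.58 s for 100 m, and a marathon record of 2:02:57, the record before 2018; both figures are my assumptions, not stated above):

```python
MARATHON_M = 42_195
marathon_s = 2 * 3600 + 2 * 60 + 57    # 2:02:57 (assumed marathon record at the time)
sprint_s = 9.58                        # 100 m world record (assumed: Bolt, 2009)

marathon_speed = MARATHON_M / marathon_s   # ~5.72 m/s
sprint_speed = 100 / sprint_s              # ~10.44 m/s
print(f"{marathon_speed / sprint_speed:.0%}")   # → 55%
```

With a newer, faster marathon record the ratio creeps up slightly, but stays in the mid-50s either way.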
Could you elaborate on this bit? Or maybe give an example of what you mean?