Looking away from programming to the task of writing an essay or a short story, a textbook or a novel, the rule holds true: each significant increase in capability requires a doubling of effort, not a mere linear addition.
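Taken literally, the rule would mean capability grows logarithmically with effort: something like C(e) = k · log₂(e / e₀) for some constants k and e₀ (my notation, not anything fixed by the rule itself), so that each fixed gain of k in capability demands a doubling of e, however much effort has already been invested.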
If this is such a general law, should it not apply outside human endeavor?
You’re reaching. There is no such general law. There is just the observation that whatever is at the top of the competition in any area is probably subject to some diminishing returns near that local optimum. This is an interesting generalization and could provide insight fuel. But the diminishing returns are in no way magically logarithmic in total benefit per effort, even though in a few particular cases they happen to be.
The only way to create the conditions for any sort of foom would be to shun a key area completely for a long time, so that it could ultimately be plugged, all at once, into a system that is very highly developed in other ways. But every even slightly promising path has had people working on it, so no such neglected key area exists.
This is a nice argument. You still can’t exclude the possibility of potent ingredients or approaches toward AI that we haven’t yet conceived of, but on the whole I believe as you do.
Certainly if someone can come up with a potent approach toward AI that we haven’t yet conceived of, I will be very interested in looking at it!