It seems that humans, even groups of humans, are not capable of fast recursive self-improvement. What is missing that prevents any one of them from prevailing?
I would guess the reason is that people don’t work with exact numbers, only with approximations. If you build a very long chain of equations, the noise kills the signal. In mathematics, if you know “A = B” and “B = C” and “C = D”, you can conclude that “A = D”. In real life your knowledge is more like “so far it seems to me that, under usual conditions, A is very similar to B”. A hypothetical perfect Bayesian could perhaps assign a probability to each link and work with that, but even our estimates of probabilities are noisy. Also, the world is complex; effects do not combine linearly.
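To see how fast this degrades, here is a minimal sketch of chaining noisy “A is very similar to B” links. The 5% relative noise per link and the multiplicative error model are my assumptions, chosen just to make the drift visible:

```python
import random

# Model each "A is very similar to B" link as a multiplicative step
# with ~5% Gaussian noise (the 5% figure is an assumption for
# illustration). After a couple dozen links, the accumulated error
# swamps any real relationship between the endpoints.

random.seed(0)
true_value = 100.0
estimate = true_value
for step in range(1, 21):
    estimate *= random.gauss(1.0, 0.05)  # one noisy "is similar to" link
    drift = abs(estimate - true_value) / true_value
    print(f"after {step:2d} links: estimate = {estimate:7.2f} (drift = {drift:.0%})")
```

The individual links look harmless; it is the chaining that ruins the conclusion.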
I suspect that when one tries to generalize, one gets a lot of general rules that each hold with maybe 90% probability. Try to chain a dozen of them together, and the result is pathetic: 0.9^12 is about 0.28. It is like saying “give me a fixed point and a lever and I will move the world”, only to realize that your lever is too floppy to move anything far and heavy.
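The arithmetic is brutal even under an idealized assumption (mine, not claimed by anyone) that the rules fail independently:

```python
# If each rule holds with probability 0.9 and failures are independent
# (an idealization), a conclusion that needs all n rules to hold
# inherits confidence 0.9**n.
for n in (1, 3, 6, 12, 24):
    print(f"chain of {n:2d} rules: P(conclusion holds) = {0.9 ** n:.2f}")
# A chain of 12 rules gives 0.28 -- worse than a coin flip.
```

Correlated failures or a chance to check intermediate results could change the picture, but the basic shape is exponential decay in the length of the chain.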