I don’t know anything about the specific AI architectures in this post, but I’ll defend non-apples. If one area of design-space is very high in search ordering but very low in preference ordering (i.e., a very attractive-looking but in fact useless idea), then telling people to avoid it is helpful far beyond the seemingly low amount of optimization power it gives.
A metaphor: religious beliefs constitute a very small and specific area of beliefspace, but that area looks very attractive at first. You could spend your whole life searching within that area and never get anywhere. Saying “be atheist!” provides a trivial amount of optimization power. But that doesn’t mean it’s of trivial importance in the search for correct beliefs. Another metaphor: if you’re stuck in a ditch, most of the effort it takes to journey a mile will go into the ten vertical meters of climbing out.
Saying “not X” doesn’t make people go for all non-X equally. It makes them apply their intelligence to the problem again, ignoring the trap at X that they would otherwise fall into. If the problem is pretty easy once you stop trying to sell apples, then “sell non-apples” might provide most of the effective optimization power you need.
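A toy sketch of that dynamic, in Python (everything here, from the trap region to the budget numbers, is my own invention for illustration, not anything from the post): a greedy searcher that follows the search ordering alone spends its whole budget in the attractive-but-useless region, while one told “not apples” finds a genuinely good design.

```python
import random

# Toy model: one region of design-space sits at the top of the search
# ordering but at the bottom of the preference ordering. All names and
# numbers are invented for illustration.

random.seed(0)

def is_trap(design):
    return design.startswith("apple")  # the attractive-but-useless region

def attractiveness(design):
    """Search ordering: how tempting a design looks before you try it."""
    return 1.0 if is_trap(design) else random.random()

def true_value(design):
    """Preference ordering: how good the design actually turns out to be."""
    return 0.0 if is_trap(design) else random.random()

def search(candidates, budget, skip_traps=False):
    """Examine the `budget` most attractive candidates; optionally heed
    the advice 'sell non-apples' and return the best value found."""
    pool = [d for d in candidates if not (skip_traps and is_trap(d))]
    pool.sort(key=attractiveness, reverse=True)
    return max((true_value(d) for d in pool[:budget]), default=0.0)

designs = [f"apple_{i}" for i in range(90)] + [f"other_{i}" for i in range(10)]
print("naive search:      ", search(designs, budget=5))  # stuck at 0.0
print("told 'non-apples': ", search(designs, budget=5, skip_traps=True))
```

In this toy setup the “not apples” advice carries only log2(100/10) ≈ 3.3 bits of optimization power, but those are the bits that let the rest of the searcher’s intelligence do any work at all.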