Seems to me that there’s rather a large gap between “interestingly powerful” and superhuman in Eliezer’s sense. We like Google Maps because it can come up with fast, general, usually-good-enough solutions to route-planning problems, but I’m nowhere near convinced that Google Maps generates solutions that suitably trained human beings couldn’t, given the same data in a human-understandable format. Particularly not solutions that are interesting because of their cleverness or originality or the other qualities we generally associate with organic intelligence.
On the other hand, automated theorem provers do exist, and they’ve generated some results that humans haven’t. It’s not inconceivable to me that similar systems could be applied to Rubik’s Cube (or comparable puzzles) and come up with interesting results, all without doing humanlike research or rewriting their own code; the sketch below gestures at what I mean. Not that this is a particularly devastating argument within the narrower context of AGI.
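(A minimal sketch of the kind of narrow, non-self-modifying search I have in mind, using a toy sliding puzzle as a stand-in for Rubik’s Cube; everything here is illustrative, not taken from any real solver. Brute-force breadth-first search finds provably shortest solutions without anything resembling insight or self-improvement.)

```python
from collections import deque

GOAL = (1, 2, 3, 4, 5, 6, 7, 8, 0)  # 0 marks the blank tile of a 3x3 sliding puzzle

def neighbors(state):
    """Yield every state reachable by sliding one tile into the blank."""
    i = state.index(0)
    row, col = divmod(i, 3)
    for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        r, c = row + dr, col + dc
        if 0 <= r < 3 and 0 <= c < 3:
            j = r * 3 + c
            s = list(state)
            s[i], s[j] = s[j], s[i]
            yield tuple(s)

def solve(start):
    """Return a shortest solution as a list of states, via plain BFS."""
    frontier = deque([start])
    parent = {start: None}
    while frontier:
        state = frontier.popleft()
        if state == GOAL:
            path = []
            while state is not None:
                path.append(state)
                state = parent[state]
            return path[::-1]
        for nxt in neighbors(state):
            if nxt not in parent:
                parent[nxt] = state
                frontier.append(nxt)
    return None  # configuration not reachable from the goal

if __name__ == "__main__":
    scrambled = (1, 2, 3, 4, 0, 6, 7, 5, 8)
    print(len(solve(scrambled)) - 1, "moves")  # prints the optimal move count
```

Nothing in that loop learns, generalizes, or edits itself, yet scaled up (the way optimal Rubik’s Cube solvers scale it up with better heuristics and group-theoretic pruning) it produces solutions no unaided human would find.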
ETA: Odd. I really didn’t expect this to be downvoted. If I’m making some obvious mistake, I’d appreciate knowing what it is.