I don’t think I understand what, exactly, is being discussed. Are “dogs” or “flowers” or “people you meet face-to-face” examples of “complicated external things”?
Right, but the goal is to make AGI you can point at things, not to make AGI you can point at things using some particular technique.
(Tangentially, I also think the jury is still out on whether humans are bad fitness maximizers, and if we’re ultimately particularly good at it—e.g. let’s say, barring AGI disaster, we’d eventually colonise the galaxy—that probably means AGI alignment is harder, not easier)
Humans can be pointed at complicated external things by other humans on their own cognitive level, not by natural selection, their maker operating at a lower level.