Well, if we’re counting things like that, this thread becomes much less interesting. They can offload math queries to Mathematica-like software or chess playing to a bundled Stockfish, but we already know software can do this, and we’re interested in the novel capabilities of language or multi-modal models.
The difference is that LaMDA/WebGPT are learning autonomously to make general use of tools (or tool AIs) provided to them as agent AIs, which is much more useful than the giant piles of hand-engineered heuristics in toys like Alexa or Wolfram Alpha. In my example, no one would have programmed it to know it should query Google for the current date; it has learned to exploit Google’s various features on its own, which is no more illegitimate than learning to call date (or a human learning to look at a clock, for that matter), and will extend to any other tools provided to it, like a Python REPL in inner-monologue work.
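To make that loop concrete, here is a minimal sketch of such a tool-use harness, not LaMDA/WebGPT’s actual interface: the TOOL:/ANSWER: convention, the tool set, and the run_tool/answer helpers are all assumptions for illustration, with a scripted stand-in where a real model call would go.

```python
# A minimal sketch of an inner-monologue tool-use loop (not LaMDA/WebGPT's
# actual interface): the model writes free text, the harness watches for
# lines like "TOOL: date" or "TOOL: search <query>", runs the tool, and
# appends the RESULT so the model can condition on it in its next step.
import datetime
from typing import Callable


def run_tool(invocation: str) -> str:
    """Execute a tool invocation emitted by the model; return its output as text."""
    name, _, arg = invocation.partition(" ")
    if name == "date":
        return datetime.date.today().isoformat()
    if name == "python":
        # Toy stand-in for a REPL: evaluate one expression (a real harness would sandbox it).
        return str(eval(arg, {"__builtins__": {}}, {}))
    if name == "search":
        return f"(stub) top search result for: {arg}"  # a real harness would query Google here
    return f"unknown tool: {name}"


def answer(question: str, llm: Callable[[str], str], max_steps: int = 5) -> str:
    """Alternate model steps and tool calls until the model emits an ANSWER: line."""
    transcript = f"Question: {question}\n"
    for _ in range(max_steps):
        step = llm(transcript)
        transcript += step + "\n"
        if step.startswith("TOOL:"):
            transcript += "RESULT: " + run_tool(step[len("TOOL:"):].strip()) + "\n"
        elif step.startswith("ANSWER:"):
            return step[len("ANSWER:"):].strip()
    return "(no answer within the step budget)"


if __name__ == "__main__":
    # Scripted stand-in for a model that has learned it should consult the date tool.
    scripted = iter(["TOOL: date", "ANSWER: today is the date shown in the RESULT line"])
    print(answer("What is today's date?", llm=lambda transcript: next(scripted)))
```

Note that nothing in the harness encodes “check the date”: which tool to call, and when, is exactly the part the model has to learn, which is the point.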
Sure, it would be useful, especially if they’re gunning for it to become a general chatbot assistant to take on Alexa or Google Home. But recognizing a factual query and offloading it to Google has been done by these assistants for years, and it’s not something anybody would find impressive anymore, even if the classifier were part of a larger net.
The ability for the AI to use tools like that is both impressive and useful, though.