For many queries, Google has been offering an answer rather than a link for some time: calculations, graphs, facts, etc. It is slowly becoming more of an answer engine than a search engine. I assume Google is now working furiously to catch up with other LLM UIs, and it is in a good position to do so if it can let go of the Search mentality.
I think I read a thread somewhere saying that Google has a lot of tooling built and many teams already dedicated to integrating LLMs into its products, but that the economics apparently don't make sense at the moment: the cost of running these models would need to come down by 1-2 orders of magnitude before they'd deploy anything. And that seems plausible? Like, I haven't done a detailed analysis, but Davinci is at around $0.1/1000 words, which sounds way too high to use to augment search.
On the other hand, I expect that few people will need Gopher-like models. The mythical average person probably wants to hear what's new about celebrity X, or get a link to a YouTuber's channel, and so on. When they need a link to a Wikipedia page or an answer to a pub-quiz question, I suspect GOFAI is enough. So maybe cost is slightly less of an issue if only 1 in 10 to 1 in 100 queries needs these models.
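The cost argument above can be made concrete with a back-of-envelope calculation. The only figure taken from the thread is the ~$0.1/1000 words price; the answer length and the fraction of queries routed to the big model are assumptions I'm making up for illustration:

```python
# Back-of-envelope: effective per-query cost of LLM-augmented search.
# Only COST_PER_1000_WORDS comes from the thread; the other two
# parameters are purely illustrative assumptions.

COST_PER_1000_WORDS = 0.10   # quoted Davinci-level price (USD)
WORDS_PER_ANSWER = 200       # assumed length of a generated answer
FRACTION_NEEDING_LLM = 0.05  # assume ~1 in 20 queries needs the big model

# Cost of one LLM-generated answer.
cost_per_llm_answer = COST_PER_1000_WORDS * WORDS_PER_ANSWER / 1000

# Averaged over all queries, if most are handled by classical search.
effective_cost_per_query = cost_per_llm_answer * FRACTION_NEEDING_LLM

print(f"cost per LLM answer:      ${cost_per_llm_answer:.3f}")
print(f"effective cost per query: ${effective_cost_per_query:.4f}")
```

Under these assumptions each LLM answer costs about two cents, but routing only the hard queries to the model drops the blended cost per query by a further 20x, which is roughly the "1-2 OOM" gap mentioned above.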
Related: seems like some search engines are already integrating LLMs:
- One approach is directly providing links; see https://metaphor.systems, brought up yesterday @ https://www.lesswrong.com/posts/rZwy6CeYAWXgGcxgC/metaphor-systems
- Another is LLM summarization of search engine provided links; https://you.com/search?q=what+was+the+recent+breakthrough+in+fusion+research%3F as an example