Well, these are all either non-problems or solved/solvable ones.
you don’t need to serve pages, only answer queries at a reasonable rate, which the bot seems to be doing pretty well already, with minimal scaling implemented so far.
you can apply the usual caching for identical queries.
you can probably have a light-weight version that bunches similar queries (the metric of similarity is, of course, subject to investigation).
Google hits also give you iffy answers, a lot of them. They are not hallucinations, but not really better in terms of accuracy, usefulness or fake authoritativeness.
ChatGPT is intentionally gimped in terms of up-to-date internet access, easily fixed.
One thing that would complicate the usual scaling approaches is that each instance apparently has a session, and the results of subsequent queries are session-dependent. Then again, the likes of Google Assistant already deal with the same issue.
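To make the session issue concrete, here is a toy sketch (all names hypothetical): each reply depends on the full prior transcript, so workers either need sticky routing or, as below, a shared keyed store of per-session history that lets any stateless worker serve any follow-up.

```python
from collections import defaultdict

# Shared per-session transcript store (a real deployment would use
# something like a distributed key-value store, not an in-process dict).
histories: dict[str, list[str]] = defaultdict(list)

def handle_query(session_id: str, query: str) -> str:
    # Append the query to this session's history; the reply is
    # conditioned on everything said so far in the session.
    histories[session_id].append(query)
    # Stand-in for a model call over the whole transcript (hypothetical).
    return f"reply #{len(histories[session_id])} in session {session_id}"
```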
Sam Altman talked about how the average ChatGPT chat costs a few cents in compute. That sounds like orders of magnitude more than what a Google query costs in compute.

One Quora answer suggests Google earns about 0.6 cents per search on average. The answer is old and Google might make more revenue per average search now, but there's a good chance that ChatGPT currently costs more in compute per chat than a Google search makes in revenue.
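The back-of-envelope version of that comparison, taking "a few cents" as 3 cents (both figures are rough public estimates, not measurements):

```python
# Rough, publicly quoted estimates in dollars.
chatgpt_cost_per_chat = 0.03       # "a few cents" of compute per chat
google_revenue_per_search = 0.006  # ~0.6 cents revenue per search (old figure)

# Ratio of ChatGPT's compute cost to Google's per-search revenue.
ratio = chatgpt_cost_per_chat / google_revenue_per_search
print(ratio)  # roughly 5x, i.e. within one order of magnitude
```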
Hmm, Google has been in business for over a quarter century, and these guys have their first viable product, and the cost difference is only one order of magnitude, and without any scaling optimization? I’d say that Google better get its act together if they don’t want to lose their search business.
...Which is no longer “search”, this is an outdated term. The paradigm has shifted. Almost-natural language virtual assistants were the first signs, and now LLMs are taking over.
Their API has been a viable product before this. GitHub Copilot is built on OpenAI Codex and is a viable product.
> and the cost difference is only one order of magnitude
The difference between Google's compute cost and its advertising revenue per search is within one order of magnitude. Google does not currently pay as much for compute as it makes in revenue per search, but likely about an order of magnitude less.