You’re imagining that Google stays the same in the way it indexes and presents the web. What if it decides people like seeing magic answers to all their questions, or notices that consumers have a more favorable opinion of Google if Google appears to index all the answers to their questions, and so Google by default asks gpteeble (or whatever) to generate a page for every search query as it comes in, or maybe for every search query for which an excellent match doesn’t already exist on the rest of the web?
Imagine Google preloads the top ten web pages that answer your query, and you can view them in a panel/tab just by mousing over the search results. You mouse over them one by one until you find one that seems relevant, but it’s not one that Google retrieved from a web search; it’s one that Google or a partner generated in response to your query. It looks just the same. Maybe you don’t even look at the URL most of the time to notice it’s generated (the UI has gone more thumbnaily, less texty). Maybe “don’t be evil” Google puts some sort of noticeable disclaimer on generated content, but the content still seems good enough for the job to all but the most critically discerning readers (the same way people often prefer bullshit to truth today, but now powered by AI; “it’s the answer I hoped I’d find”), and so most of us just tune out the disclaimer.
It seems that either (a) the AI-powered sites will in fact give more useful answers to questions, in which case this change might actually be beneficial, or (b) they will give worse answers, in which case people won’t be likely to use them. Don’t you think people will stop trusting such sites after the first five times they try eating their own toenails to no avail? And for the purposes of finding plausible bullshit to support what you already think, I think GPT-powered sites have the key disadvantage of being poor evidence to show other people: it looks pretty bad for your case if your best source is a generated website (normal websites could also be generated without advertising it, of course, but that’s a separate matter). You seem to be imagining a future in which Google does the most dystopian thing possible for no reason in particular.
Google already pivoted once to providing machine-curated answers that were often awful (e.g. https://searchengineland.com/googles-one-true-answer-problem-featured-snippets-270549). I’m just extrapolating.