Google’s AI Overviews feature changes how search results work. Instead of linking out, Google increasingly answers queries directly. That means fewer clicks for site owners - and potentially a fundamental shift in the web’s traffic economy.
This isn’t theoretical. Public company earnings calls already show the impact:
- IAC Group: “The portion of our traffic that comes from Google Search has declined from 52% to 28%”
- Groupon: “Google is significantly changing the behavior of the search result pages. What we see is that we have declining traffic.”
- CarGurus: “on Google, when Google provides the AI response, there's a much lower click rate”
However, as some pointed out in the replies, this isn’t universal. Booking.com said: “if you look at the performance marketing channels, it's actually to some extent, interesting that the Google clicks continue to hold up quite well”.
Google defends the changes, claiming in a recent blog post, “AI in Search is driving more queries and higher quality clicks”:
Overall, total organic click volume from Google Search to websites has been relatively stable year-over-year. Additionally, average click quality has increased and we’re actually sending slightly more quality clicks to websites than a year ago (by quality clicks, we mean those where users don’t quickly click back — typically a signal that a user is interested in the website). This data is in contrast to third-party reports that inaccurately suggest dramatic declines in aggregate traffic — often based on flawed methodologies, isolated examples, or traffic changes that occurred prior to the roll out of AI features in Search.
Historically, the deal was fair:
- You allow Google to crawl your site.
- Google shows snippets in search results.
- You get traffic in return.
Sometimes, algorithm changes hurt, but overall, the traffic upside made it worth it.
The equation may be shifting.
Crawling for AI
Google now uses your content not just for search, but for AI training and grounding (feeding Gemini models context at query time). You can opt out by blocking Googlebot - but that also removes you from search results entirely.
Compare this with OpenAI, which offers granular control:
| Bot | Purpose | User agent string |
| --- | --- | --- |
| OAI-SearchBot | Search index for ChatGPT (like Google’s index) | `OAI-SearchBot/1.0; +https://openai.com/searchbot` |
| ChatGPT-User | When ChatGPT visits your site in real time to provide an answer to a user question | `Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko); compatible; ChatGPT-User/1.0; +https://openai.com/bot` |
| GPTBot | AI model training | `Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko); compatible; GPTBot/1.1; +https://openai.com/gptbot` |
| ChatGPT Agent | Used when ChatGPT launches a web browser tool to browse your site on behalf of a user (Agent Mode) | `Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36` — looks like any modern Chrome browser, but also includes signature headers per RFC 9421 (HTTP Message Signatures) |
As a site owner:
- Block GPTBot to stay out of training data.
- Allow OAI-SearchBot to appear in search, but block others.
- Check the Signature-Agent header to control browser automation.
You know exactly what’s hitting your site, and you can make targeted choices.
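These choices map directly onto robots.txt directives. A minimal sketch of the "stay in search, opt out of everything else" policy described above (bot names from OpenAI’s documentation; whether each bot honors its rules is up to the vendor):

```
# Opt out of AI model training
User-agent: GPTBot
Disallow: /

# Stay in ChatGPT's search index
User-agent: OAI-SearchBot
Allow: /

# Block real-time page fetches for answering user questions
User-agent: ChatGPT-User
Disallow: /
```

The ChatGPT Agent browser doesn’t advertise a distinct user agent token here, which is why the Signature-Agent header check matters for that case.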
Developers can code their own user agent lookup (be sure to verify the IPs!) or integrate an in-code security product like Arcjet’s bot detection functionality.
But how do you make those granular choices with Google? If you’re Groupon or CarGurus, how do you control where your content shows up in Google search results?
Google’s opacity
Google lists its crawlers, but doesn’t clearly say which power AI Overviews. The only AI-related opt-out is Google-Extended:
…manage whether content… may be used for training future generations of Gemini models… and for grounding… in Gemini Apps and Vertex AI.
Critically, blocking Google-Extended does not remove you from Search, but it doesn’t seem to cover all AI use in search results either. There’s no equivalent of “block training but allow indexing” for AI Overviews.
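For completeness, the one AI-related opt-out Google does offer looks like this in robots.txt (assuming Google-Extended behaves as documented, i.e. it affects Gemini training and grounding but not Search crawling):

```
# Opt out of Gemini model training and grounding
User-agent: Google-Extended
Disallow: /

# Googlebot still crawls for Search (and, apparently, AI Overviews)
User-agent: Googlebot
Allow: /
```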
The coming choice
Right now, most site owners tolerate the trade-off: some traffic is better than none. But what happens when traffic drops below the threshold that justifies free content access, or when AI answers keep users inside Google’s ecosystem entirely?
For decades, the web’s implicit contract was: let us crawl you, and we’ll send you traffic. If the traffic stops, the contract is broken.
How long until we need to block Google?