AI, SEO, and the Myth of Google Zero
A few days ago, I was listening to a radio discussion on BFM 89.9 titled "Is ChatGPT Killing Google Search?". The host and guest were debating whether AI tools like ChatGPT are about to eat Google's lunch. They spoke of a world where "Google Zero" arrives, the day Google stops sending traffic to websites, leaving marketers scrambling to adapt.
It’s the kind of conversation that captures the current anxiety around AI and search. But if we look closely at the claims made, the reality is less sensational and perhaps a bit more practical.

Claim 1: “ChatGPT Drives 80% of AI Referrals”
During the discussion, the host noted that ChatGPT accounts for “around 80% of AI referrals”. In other words, among traffic attributed to AI platforms, ChatGPT is currently dominant.
The risk is in how easily this statement can be misheard or misinterpreted — as if 80% of all web referrals were already coming from ChatGPT. That is not the case.
- ChatGPT vs. Google scale: In February 2023, ChatGPT had about 51M visits, while Google Search had 3.4B. That’s less than 2% of Google’s scale (SimilarWeb).
- Growth without dominance: By June, AI referrals surged 360% YoY to 1.1B, but Google still drove 191B referrals (SimilarWeb Blog).
- Interpretation matters: ChatGPT is the leader among AI platforms, but remains a fraction of Google’s overall traffic power.
“ChatGPT may dominate AI referrals, but search engines still dominate the internet.”
Claim 2: “LLMs Rely on a TXT File Listing All Site Content”
In the discussion, the guest suggested that optimizing for AI might require uploading a file, llm.txt, that lists all your site's content. The idea is that large language models (LLMs) would crawl this file to better understand your site.
It sounds plausible, but it reflects a misunderstanding.
What llm.txt Actually Is
The concept of llm.txt began as a community-driven proposal, similar in spirit to robots.txt. The idea is not to feed AI models all your content, but to control how LLMs interact with your site.
- Permission control: Just as robots.txt tells search engine crawlers which pages they can or cannot index, llm.txt is meant to tell AI crawlers (from OpenAI, Anthropic, Perplexity, etc.) which parts of a site they are allowed to use.
- Transparency: It gives website owners a way to opt in or opt out of having their content scraped for AI training or retrieval.
- Industry adoption: Some AI companies have signaled they will respect such directives; OpenAI, for instance, documents robots.txt rules for its GPTBot crawler. Others are experimenting with their own crawling standards.
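In practice, the opt-out mechanism the major AI vendors document today is the familiar robots.txt rather than a standardized llm.txt. A sketch of what allowing or blocking their crawlers looks like (user-agent names are the ones each vendor publishes; the paths are illustrative):

```
# robots.txt, placed at the site root.
# Block OpenAI's crawler from a private section:
User-agent: GPTBot
Disallow: /private/

# Block Anthropic's crawler entirely:
User-agent: ClaudeBot
Disallow: /

# Allow Perplexity's crawler everywhere:
User-agent: PerplexityBot
Allow: /
```

Note that, exactly as the article argues, these directives only control access; they do nothing to make the allowed content rank better in AI answers.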
Why It’s Not an Optimization Tool
Where the radio discussion went astray was in presenting llm.txt as a content map: a file that lists everything you want an AI to see. In reality:
- Retrieval-Augmented Generation (RAG) systems don’t rely on a single text file. They use vector embeddings, databases, and semantic matching to pull relevant information (Prompting Guide: RAG; Wikipedia: RAG).
- An llm.txt file does not improve your ranking or authority in LLM responses. It only affects whether your content can be used in the first place.
- Optimization still comes from clear structure, depth, and authoritative writing, not from creating a new kind of sitemap.
Think of llm.txt as a gatekeeper, not a booster. It controls access, but doesn't guarantee visibility.
So while llm.txt is useful for managing permissions, it is not the magic lever for AI visibility. The real work of AI-era SEO remains the same: creating content that positions you as a trusted authority worth citing.
Remember that LLMs are not search engines. They don't collect, digest, categorize, and store information on their own. To return useful results, they rely on retrieval techniques, and those techniques, yes, you guessed it, are often supplied by search engines:
- Retrieval-Augmented Generation (RAG) relies on vector embeddings and semantic search, not static text dumps (Prompting Guide).
- Context-aware retrieval means AI adapts answers to the query, drawing from multiple curated sources (Wikipedia: RAG).
- A txt file can’t reflect nuance, updates, or authority, and LLMs don't store information the way a search engine's index does.
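The retrieval step those bullets describe can be sketched in a few lines. Here, toy bag-of-words vectors and cosine similarity stand in for a real embedding model and vector database, and the documents are made up for illustration:

```python
# Minimal sketch of the retrieval step in RAG. Bag-of-words counts
# stand in for learned embeddings; production systems use dense
# vectors from an embedding model plus a vector database.
import math
import re
from collections import Counter

def embed(text: str) -> Counter:
    # Toy "embedding": lowercase word counts. A real system would
    # call an embedding model and get back a dense vector.
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse word-count vectors.
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    # Rank documents by semantic-ish similarity to the query.
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

docs = [
    "robots.txt tells crawlers which pages they may index",
    "vector embeddings map text to points in semantic space",
    "email newsletters are an owned distribution channel",
]
context = retrieve("how do crawlers know what to index?", docs)
# The retrieved passage is then inserted into the LLM's prompt,
# which is the "augmented" part of retrieval-augmented generation.
prompt = f"Answer using this context: {context[0]}"
```

The point of the sketch: relevance comes from matching the query against many indexed passages at answer time, which is exactly what a single static txt file cannot do.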
Structure still matters, but authority, depth, and clarity matter more. Remember E-E-A-T?
Optimization for AI isn’t about appending a file. It’s about becoming the source worth citing.
Claim 3: “Marketers Must Diversify to Avoid Platform Dependency Risk”
Here’s where the conversation struck gold. Dependency on a single platform has always been risky. The rise of AI doesn’t change that; it reinforces it. And this applies not only to business, but to most things generally.
- Single-point failure: Stripe warns that relying on one provider exposes businesses to catastrophic policy or technical shifts (Stripe).
- Real-world fallout: Indie Hackers documents SaaS founders losing entire businesses when Facebook or Google changed their rules overnight (Indie Hackers).
- Diversification: Building multi-platform strategies, maintaining owned channels like newsletters, and reducing reliance on any one algorithm are proven defenses. Some people say marketing email is dead, and they would like you to believe it, but we need to own our distribution channels, and email is universal. So folks, keep building relationships and your email lists!
For marketers, the rise of AI is not about choosing between Google or ChatGPT. It’s about ensuring that no single algorithm dictates your reach.
Claim 4: “AI-Generated Content Will Blur the Line Between Real and Fake”
The discussion also touched on the blurring of real and artificial content, a deeper shift in how we consume media.
- Synthetic media is advancing: Tools like Google’s Veo 3 can generate near-photorealistic video, making detection harder (ACM).
- Detection is an arms race: While AI-generated text sometimes shows anomalies, researchers admit detection tools are constantly playing catch-up (Wharton).
- Storytelling endures: Machines can create text and video, but authentic human narrative still resonates in ways algorithms cannot.
In a flood of machine-made content, authentic storytelling becomes more valuable, not less.
The Bigger Picture
The radio talk painted AI as a looming existential threat to SEO (and search engines). But the evidence points to a more grounded reality:
- ChatGPT dominates AI referrals, but Google still dominates the web.
- LLMs thrive on authority and structure, not txt files.
- Platform dependency is a real risk: Diversification is survival.
- Storytelling remains the ultimate differentiator.
Marketers shouldn’t fear “Google Zero” as an apocalypse. The more realistic future is that AI becomes another powerful channel, one that rewards the same fundamentals SEO has always prized: depth, authority, clarity, and trust.
We must not forget that Google's dominance in search, and as our gateway to the web, is due not only to product superiority or our own habits, but also to its ability to control the market by paying billions to distribution channels. That arrangement is being challenged now, but I suspect that if ChatGPT gets too much on Google's nerves, Google will simply pay billions more. It can, and I don't doubt that it will.
So folks, the blogs, FAQs, and thoughtful content that establish expertise will continue to matter. In fact, in the AI era, they may matter more than ever.