For the last year, every other LinkedIn post has been telling agency owners they need to rebuild their SEO around AEO, GEO, llms.txt files, content chunking, schema for AI, and a growing list of new acronyms. This week Google published official guidance on showing up in AI Overviews and AI Mode. The to-do list is short. The skip list is where the document gets interesting.
The short version is that AI Overviews and AI Mode run on the same Search index everyone has been working with for years. Retrieval-augmented generation pulls pages from that index and summarizes them. Query fan-out splits one question into a handful of related sub-queries and pulls more pages. Both depend on the same ranking signals as regular Search, which is why Google flat-out says optimizing for AI search is still SEO. That's the whole framing. If you've been worried your agency missed the boat on some new discipline, you didn't. There's no new boat.
What Google says you can stop worrying about
The myth-busting section is where this document earns its keep. Google walks through a list of advice that's been making the rounds and tells you, with unusual directness, that you can ignore most of it.
The llms.txt file comes up first. You don't need it. Google might crawl the file because Google crawls lots of files, but it gets no special treatment. The whole pitch was wishful thinking from people who wanted to sell agencies a new deliverable. Content chunking is also out. Google can handle nuance and multiple topics on one page, so there's no magic word count and no ideal section length. Writing for humans still works. Same story with rewriting content "for AI." The models understand synonyms and intent, and you don't need 14 versions of the same FAQ to cover every possible phrasing. Doing that at scale is what the scaled content abuse policy was written to penalize.
Planting inauthentic mentions across random blogs and forums doesn't do what the people selling that service claim. Core ranking systems still filter for quality, spam systems still block spam, and AI Overviews lean on both. Structured data falls into a softer category. It still helps for rich results in regular Search, so don't tear out your schema, but it's not the magic AI ingredient some people have been pitching, and no new schema types are required.
What's missing from the skip list is also worth a beat. E-E-A-T doesn't come up. Neither do "AI authority signals," brand mentions, or any of the new acronyms the AEO industry has been minting. The document goes out of its way to not validate any of it.
The point Google buries in plain sight
The actual driver of visibility in AI Overviews shows up in one paragraph, and it's the part most people will scroll past. Unique, first-hand, non-commodity content matters more than any of the technical work in the rest of the guide.
The example Google uses is worth sitting with. "7 Tips for First-Time Homebuyers" is commodity content. Anyone could write it. The model already has 800 versions of it. "Why We Waived the Inspection and Saved Money: A Look Inside the Sewer Line" is non-commodity. It comes from someone who actually did the thing and learned something specific from it. The first kind of page is what AI summarizers don't need from you, because they already have it from someone else. The second kind is what they reach for, because nobody else has it.
That's a harder bar than most agency content is clearing. Reworded top-three results don't get there. Generic keyword-research listicles don't either. The work has to come from somewhere real, which means either you know the thing or you talk to someone who does and write down what they tell you. It's been true for years. It's just more obviously true now.
The AI-generated content question
Google also tightened up its position on using AI to write content. Research and structure are fine. Mass-produced output you didn't think about is not, and the scaled content abuse policy is the lever they'll pull when it gets out of hand.
Sections 4.6.5 and 4.6.6 of the Search Quality Rater Guidelines are worth reading if you want the detailed version. Raters don't directly affect rankings, but their feedback trains the systems that do. The bar they're trained to apply is whether a page shows originality, effort, and added value. A lot of agency content doesn't, and that's the part that should be uncomfortable.
Ecommerce got specific rules. AI-generated product images need IPTC metadata flagging them as AI-generated. AI-generated titles and descriptions have to be labeled separately. Merchant Center will enforce this if it isn't already. If you're running shopping feeds for clients, that's a workflow change you need to plan for.
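For the image side, the relevant IPTC flag is the DigitalSourceType property with the trainedAlgorithmicMedia value from IPTC's digital source type vocabulary. As a rough sketch (the exact packet your tooling writes will differ, and Google's Merchant Center docs are the source of truth for what's required), an embedded XMP block carrying that flag looks something like this:

```xml
<?xpacket begin="" id="W5M0MpCehiHzreSzNTczkc9d"?>
<x:xmpmeta xmlns:x="adobe:ns:meta/">
  <rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#">
    <rdf:Description rdf:about=""
        xmlns:Iptc4xmpExt="http://iptc.org/std/Iptc4xmpExt/2008-02-29/">
      <!-- IPTC digital source type: image was generated by a trained model -->
      <Iptc4xmpExt:DigitalSourceType>http://cv.iptc.org/newscodes/digitalsourcetype/trainedAlgorithmicMedia</Iptc4xmpExt:DigitalSourceType>
    </rdf:Description>
  </rdf:RDF>
</x:xmpmeta>
<?xpacket end="w"?>
```

In practice you wouldn't hand-write this; most image pipelines can set it with a tool like exiftool or an image metadata library, and the key thing to preserve is that the flag survives whatever resizing and CDN steps sit between your upload and the live feed.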
The honest read on AI content is that drafting and research are normal now, and a content factory operation is a slow way to crater a site.
What to do with this
Pick one client site. Pull the five pages that matter most. For each one, ask whether the page says something that isn't already sitting on the first page of Google for the same query. If the answer is no on all five, that's where the work goes. Not into a new text file at the root of the domain.
Put this into practice
Manage your agency smarter
SmartMetrics gives agencies the tools to track client health, automate reporting, run audits, and deliver a fully branded client experience — all in one place.
- No credit card required
- Setup in minutes
- Cancel anytime