Let me be direct about something before we get into this: a lot of what's written about AI and SEO is either catastrophising (SEO is dead) or dismissing (nothing has changed, keep doing what you were doing). Neither is accurate.
The honest answer is more nuanced. Some things changed significantly. Some things stayed the same. And some things got more important — not less.
Here's my current assessment, based on running SEO campaigns for clients and managing my own publishing network across six sites.
What Hasn't Changed
Let's start here, because it's important context.
Google's index still matters. AI Overviews pull from Google's search index. ChatGPT's web search uses Bing. Perplexity uses its own crawler but also draws on established search signals. If you don't rank in traditional search, you're less likely to be retrieved as a candidate for AI answers. Traditional SEO is the foundation. GEO is built on top of it.
Backlinks still matter. Off-site authority signals are still one of the strongest ranking factors — and they're also GEO signals, because they contribute to your entity's perceived authority.
Technical SEO still matters. Site speed, crawlability, indexation, canonical URLs, structured data. All of these have always mattered for traditional SEO and they matter equally for GEO.
Content quality still matters. Possibly more than ever. The proliferation of low-quality AI-generated content has raised the bar for what gets ranked and cited. Thin, generic, unoriginal content is being filtered out more aggressively — by both traditional algorithms and AI citation systems.
What Has Changed
The search result page looks different
AI Overviews now appear on roughly 15–25% of all Google searches, and that figure is growing. For informational queries — "how to do X", "what is Y", "best Z for W" — the AI answer often occupies the top of the screen before the traditional results.
This has measurable click-through rate implications. Pages that previously ranked #1 for informational queries are seeing lower CTR because the AI Overview satisfies the query before the user reaches the link. That's not going to reverse.
The strategic response is two-part: pursue AI Overview citations directly (so you're the source being synthesised) and shift content investment toward commercial-intent queries where AI Overviews are less common and the click still happens.
Author identity matters more
Google has been pushing E-E-A-T (Experience, Expertise, Authoritativeness, Trustworthiness) for years. In 2026, it matters more than ever — because AI systems are specifically trained to favour content from identifiable, credible sources.
Anonymous content — "written by the marketing team", no byline, no author page — is at a structural disadvantage. Named authors with verifiable profiles, industry credentials, and external citations are what AI systems are trained to trust.
This is not a minor adjustment. It means every piece of content on your site should have a named human author with a profile that establishes their expertise in the subject matter.
Structured data is more valuable
Schema.org structured data was always useful for traditional SEO — it helped search engines understand your content and enabled rich results. For AI systems, it's more than useful: it's how they read your content most cleanly.
A FAQPage schema with clear question-and-answer pairs is far more extractable for an AI than a FAQ section buried in HTML without markup. An Organization schema with your full entity information tells AI systems exactly who you are. A Service schema with pricing gives AI engines the commercial context they need to recommend you.
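As a concrete sketch, here's what minimal FAQPage markup looks like when built programmatically — the question and answer text below is placeholder content, not a recommendation of specific copy:

```python
import json

# Minimal FAQPage JSON-LD (schema.org). The question/answer text is a placeholder.
faq_schema = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": "What is Generative Engine Optimisation (GEO)?",
            "acceptedAnswer": {
                "@type": "Answer",
                "text": "GEO is the practice of structuring content so AI "
                        "search engines can retrieve and cite it.",
            },
        }
    ],
}

# Embed the output inside a <script type="application/ld+json"> tag in the page.
print(json.dumps(faq_schema, indent=2))
```

Each question-and-answer pair becomes an explicit, machine-readable unit — exactly the shape an AI system can lift out cleanly, rather than having to infer structure from unmarked HTML.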
Most sites are significantly under-structured. This is a genuine competitive advantage for businesses that get this right.
AI crawler access is a new technical requirement
This is new and most businesses haven't thought about it. The major AI systems — OpenAI, Anthropic, Google, Perplexity — all operate their own web crawlers. These crawlers need to be able to access your site to include your content in their training data and retrieval systems.
Many sites block these crawlers by accident: a blanket "User-agent: *" / "Disallow: /" rule in robots.txt blocks everything, security configurations that allow only known user agents shut out AI crawlers, and Cloudflare bot-protection settings tuned too aggressively do the same.
The first step in any GEO engagement is checking that each major AI crawler has clean access. This is a five-minute technical check that many businesses have never done.
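That check can be done without any special tooling. Here's a sketch using Python's standard-library robots.txt parser — the robots.txt content below is a deliberately misconfigured example (GPTBot explicitly allowed, everything else blocked) to show how an accidental block surfaces:

```python
from urllib.robotparser import RobotFileParser

# Example of an accidental misconfiguration: GPTBot is allowed,
# but the blanket "Disallow: /" blocks every other crawler.
ROBOTS_TXT = """\
User-agent: GPTBot
Disallow:

User-agent: *
Disallow: /
"""

# The AI crawlers named in this article.
AI_CRAWLERS = ["GPTBot", "ClaudeBot", "PerplexityBot", "GoogleOther"]

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

for bot in AI_CRAWLERS:
    allowed = parser.can_fetch(bot, "https://example.com/")
    print(f"{bot}: {'allowed' if allowed else 'BLOCKED'}")
```

Swap in your own site's robots.txt content (or fetch it over HTTP) and you have the five-minute check: every crawler you want indexing you should report "allowed".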
The competitive landscape for informational content changed
If you were building traffic through high-volume informational blog content — "what is X", "how to Y" — you need to reassess. That traffic is being captured by AI Overviews, and the trend is accelerating.
That doesn't mean stop producing informational content. It means:
- Produce it in a way that earns AI Overview citations (E-E-A-T, structure, depth)
- Invest proportionally more in commercial and transactional content where AI Overviews appear less
- Use informational content to build topical authority that supports your commercial pages
The Practical Strategy in 2026
Here's how I'd summarise the current approach for a business running a serious SEO programme:
Maintain the traditional SEO foundation. Technical health, quality backlinks, properly optimised pages. Nothing about AI changes the importance of this — it only makes it more important, because AI systems retrieve from the search index.
Add E-E-A-T infrastructure. Named authors, author pages with credentials, first-person case studies and experience-based content, organisation entity information. This takes time to build and compounds in value over months.
Implement structured data comprehensively. Not just for rich results — for machine readability. Every FAQ as FAQPage schema. Every service page with Service schema. Every article with proper Article schema and author attribution. Organization schema at domain level.
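To make the author-attribution point concrete, here's a minimal sketch of Article markup with an explicit author entity — every name and URL below is a placeholder, not a real profile:

```python
import json

# Article JSON-LD with explicit author attribution.
# All names and URLs here are hypothetical placeholders.
article_schema = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Example article headline",
    "author": {
        "@type": "Person",
        "name": "Jane Example",                    # placeholder author name
        "url": "https://example.com/about/jane",   # link to the author profile page
        "jobTitle": "Head of SEO",
    },
    "publisher": {
        "@type": "Organization",
        "name": "Example Ltd",
        "url": "https://example.com",
    },
}

print(json.dumps(article_schema, indent=2))
```

Linking the author object to a real profile page is what turns a byline from a cosmetic detail into a machine-verifiable E-E-A-T signal.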
Check AI crawler access. Verify that GPTBot, ClaudeBot, PerplexityBot, and GoogleOther can access your site. Fix anything that blocks them.
Monitor AI Overview appearances. Know which queries in your sector trigger AI answers. Know which competitors get cited. Know what content type and format is selected. Then produce better versions of what's being cited.
Build off-site entity signals. Digital PR, industry directories, Wikipedia where warranted, professional profiles that create a clear web of references around your business entity.
My Honest Take
Traditional SEO is not dead. It's the prerequisite for GEO. But if you're running an SEO programme in 2026 that doesn't account for AI search at all, you're leaving a meaningful competitive advantage on the table.
The window to build early AI authority is still open. It won't be open in twelve months.
The businesses I'm most concerned about are the ones doing content marketing at volume without E-E-A-T infrastructure. Their traffic from informational queries is being eroded by AI Overviews and they're publishing more content to compensate — which doesn't solve the underlying problem.
The fix isn't more content. It's better structured, better attributed, more authoritative content — combined with the technical signals that make AI engines trust and cite you.
That's GEO. And it works alongside traditional SEO, not instead of it.
Related reading: What is Generative Engine Optimisation (GEO)?, The Alotek Case Study — AI Overview in 30 Days.

Written by Damian Roche
Founder & CEO, Churchtown Media
20+ years building websites, 15+ years obsessing over SEO. Based in Southport, helping North West businesses turn traffic into revenue with Next.js and data-driven strategies.