There's a growing rush to optimise websites for AI search engines — ChatGPT, Perplexity, Claude, Google AI Overviews. But there's a prerequisite that most businesses are skipping entirely: basic SEO. If Google can't find you, AI search engines won't find you either.
This isn't a theoretical concern. After auditing hundreds of websites, a consistent pattern emerges: businesses that score poorly on AI visibility almost always have underlying SEO problems they haven't addressed. The AI-specific work can't compensate for a broken foundation.
The Foundation Problem
The conversation around AI search visibility has moved fast. llms.txt files, structured data for AI, entity optimisation, citation tracking — these are real and important topics. But they're layer-two work. Layer one is whether search engines of any kind can find, crawl, and understand your website.
The uncomfortable truth is that most websites obsessing over AI visibility are invisible to everyone. Not because of some sophisticated AI-specific gap, but because of missing meta descriptions, broken internal links, absent sitemaps, or duplicate content that prevents any crawler — Google, Bing, or AI — from indexing them properly.
Approximately 70% of small and mid-sized business websites have basic SEO issues that undermine their discoverability in both traditional and AI search.
Fixing these issues isn't glamorous. It doesn't feel cutting-edge. But it's the single most impactful thing most businesses can do for their AI search visibility, because it fixes the foundation that everything else is built on.
What the Data Shows
The MASTERY-AI Framework evaluates websites across 27 factors — 18 focused on AI readiness and 9 on SEO foundations. When we look at the scores in aggregate, a clear pattern appears:
Websites with strong SEO foundations (clean URLs, proper heading hierarchies, meta descriptions, fast load times, mobile responsiveness) typically score around 67% on AI readiness factors — even without any AI-specific optimisation. Their content is structured, their sites are crawlable, and AI systems can extract meaningful information from them.
Websites with weak SEO foundations score poorly on both. Their AI readiness scores are low, but so are their traditional search rankings. The AI-specific work they might do — adding llms.txt, implementing FAQ schemas, building entity authority — is undermined by the fact that crawlers struggle to access and understand their content in the first place.
The Numbers Don't Lie
AI search traffic currently accounts for approximately 2–5% of total search volume. Traditional Google search still delivers roughly 95% of organic traffic for most websites. Optimising exclusively for the 5% while ignoring the 95% is not a strategy — it's a distraction.
That doesn't mean AI search isn't important. It's growing rapidly, and the businesses that prepare for it now will have a significant advantage. But preparing for AI search on top of a broken SEO foundation is like installing solar panels on a house with a leaking roof. The panels might be great technology, but the roof still leaks.
How AI Engines Crawl Your Site
A common misconception is that AI search engines have fundamentally different crawling technology from Google. They don't — at least not at the discovery and indexing layer.
AI engines rely on web crawlers to discover and access content, just like Google does. They read your HTML, follow your links, respect your robots.txt, and use your sitemap to understand your site structure. If your robots.txt blocks important pages, if your sitemap is missing or outdated, or if your internal linking is broken — AI crawlers hit the same walls that Google does.
The difference comes at the processing layer. Once an AI engine has your content, it processes it differently from Google — extracting answers, building entity relationships, and generating citations rather than simply ranking pages. But it can only process content it can access. And access depends on the same technical foundations that determine Google's ability to crawl your site.
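This shared crawl layer is easy to demonstrate with Python's standard `urllib.robotparser`: the same robots.txt rules that gate Googlebot also gate AI crawlers such as GPTBot, PerplexityBot, and ClaudeBot. The user-agent names are real, but the robots.txt rules and URLs below are a hypothetical example, not a recommended configuration.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks one section for all crawlers.
robots_txt = """
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Google's crawler and AI crawlers hit exactly the same wall.
for agent in ["Googlebot", "GPTBot", "PerplexityBot", "ClaudeBot"]:
    blocked = not parser.can_fetch(agent, "https://example.com/private/page")
    allowed = parser.can_fetch(agent, "https://example.com/blog/post")
    print(f"{agent}: /private/ blocked={blocked}, /blog/ allowed={allowed}")
```

Every agent gets the same answer, because a `User-agent: *` rule applies to all of them. A page you accidentally disallow is invisible to Google and to AI engines alike.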
Three Scenarios, Three Strategies
When we audit websites for AI visibility, they tend to fall into one of three scenarios. Each requires a different strategy:
Scenario 1: High AI Readiness, Low SEO Foundation
These websites have invested in AI-specific optimisation — llms.txt files, structured data, named authorship — but have weak technical SEO. They might have impressive AI readiness scores on paper, but crawling issues keep AI systems from reaching the very content those signals describe.
Strategy: Fix the foundation first. Address missing meta descriptions, broken links, missing sitemaps, and page speed issues before investing further in AI-specific signals. The AI work you've already done will start producing results once crawlers can actually reach your content.
Scenario 2: Low AI Readiness, High SEO Foundation
These websites rank well on Google and have solid technical SEO, but haven't implemented any AI-specific optimisation. They're already 80% of the way to strong AI visibility because their content is accessible, well-structured, and crawlable.
Strategy: Add AI-specific signals on top of your strong foundation. Create an llms.txt file, add FAQ schemas, implement Person and Organization structured data, and begin tracking your AI citation presence. You'll likely see results quickly because the groundwork is already in place.
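To make "FAQ schemas" concrete, here is a minimal sketch that builds a schema.org FAQPage object in JSON-LD, the format Google's structured-data documentation describes. The question and answer text are placeholders, not content from any real site; on a live page the output would go inside a `<script type="application/ld+json">` tag.

```python
import json

# Minimal FAQPage JSON-LD using the schema.org vocabulary.
# Question and answer text below are placeholders.
faq_schema = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": "Does basic SEO affect AI search visibility?",
            "acceptedAnswer": {
                "@type": "Answer",
                "text": "Yes. AI crawlers rely on the same crawl and "
                        "index foundations as traditional search engines.",
            },
        }
    ],
}

print(json.dumps(faq_schema, indent=2))
```

Person and Organization markup follows the same pattern: a small JSON-LD object with `"@type": "Person"` or `"@type": "Organization"` and the relevant properties.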
Scenario 3: Low AI Readiness, Low SEO Foundation
These websites have significant work to do on both fronts. The temptation is to tackle both simultaneously, but this typically leads to neither being done well.
Strategy: Address SEO foundations first, then layer AI optimisation on top. A sequential approach ensures each step is built on solid ground. Start with technical SEO (crawlability, speed, mobile), then move to content SEO (structure, metadata, internal linking), and finally add AI-specific signals (llms.txt, entity data, citation optimisation).
The SEO Foundation Checklist
Before investing in AI search optimisation, verify that these fundamentals are in place:
Technical Foundations
- XML sitemap — submitted to Google Search Console, includes all important pages, excludes thin or duplicate content
- Robots.txt — not blocking important content, allows crawlers to access key pages and resources
- Page speed — Core Web Vitals passing (LCP under 2.5s, CLS under 0.1, INP under 200ms)
- Mobile responsiveness — all content accessible and usable on mobile devices
- HTTPS — secure connection across all pages with no mixed content warnings
- Clean URL structure — descriptive, readable URLs without excessive parameters or dynamic strings
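Several of these technical items can be sanity-checked mechanically. As one small example, the sitemap item comes down to "is every important page listed?" — a sketch using only Python's standard library, where the sitemap content and URLs are placeholders (a real check would fetch your own sitemap.xml instead):

```python
import xml.etree.ElementTree as ET

# Tiny example sitemap; the URLs are placeholders.
sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/services</loc></url>
  <url><loc>https://example.com/contact</loc></url>
</urlset>"""

# The sitemaps.org namespace must be declared to find <url>/<loc> elements.
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(sitemap_xml)
urls = [loc.text for loc in root.findall("sm:url/sm:loc", ns)]

print(f"{len(urls)} URLs listed in sitemap")
# Any important page missing from this list is invisible to every
# crawler -- traditional or AI -- that relies on the sitemap for discovery.
```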
Content Foundations
- Unique meta descriptions — every important page has a distinct, accurate meta description
- Proper heading hierarchy — single H1 per page, logical H2/H3/H4 structure
- Internal linking — clear pathways between related content, no orphaned pages
- Canonical tags — properly set on pages to prevent duplicate content indexing
- No duplicate content — www vs non-www resolved, HTTP redirected to HTTPS, no near-duplicate pages bloating the index
- Image alt text — descriptive alternative text on all meaningful images
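The content items above are also checkable in code. As a small illustration of the "single H1 per page" rule, a sketch using Python's standard `html.parser` that counts heading tags — the HTML fragment is hypothetical:

```python
from html.parser import HTMLParser

class HeadingCounter(HTMLParser):
    """Counts h1-h4 tags to spot heading-hierarchy problems."""

    def __init__(self):
        super().__init__()
        self.counts = {}

    def handle_starttag(self, tag, attrs):
        # HTMLParser reports tag names in lowercase.
        if tag in ("h1", "h2", "h3", "h4"):
            self.counts[tag] = self.counts.get(tag, 0) + 1

# Hypothetical page fragment, not taken from any real site.
page_html = """
<h1>Basic SEO for AI Search</h1>
<h2>Technical Foundations</h2>
<h2>Content Foundations</h2>
<h3>Heading hierarchy</h3>
"""

counter = HeadingCounter()
counter.feed(page_html)
print(counter.counts)

if counter.counts.get("h1", 0) != 1:
    print("Warning: page should have exactly one H1")
```

The same pattern extends to the other items: collect tags or attributes while parsing, then flag pages with zero or duplicate meta descriptions, or images missing alt text.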
If any of these are missing or broken, fix them before moving to AI-specific optimisation. Every item on this list also directly benefits AI crawlers, so the work isn't wasted — it's foundational.
Building AI Visibility on a Solid Foundation
Once your SEO foundation is solid, AI-specific optimisation becomes dramatically more effective. The structured data you add gets crawled. The llms.txt file you create gets accessed. The entity authority you build gets indexed.
The AImpactScanner audit was designed with this dual perspective in mind. It evaluates both AI readiness factors and SEO foundations, providing contextual guidance based on which area needs attention first. A website that scores well on AI factors but poorly on SEO gets different recommendations from one that scores well on SEO but needs AI-specific work.
The sequence matters more than the speed. A website that fixes its SEO foundation in month one and adds AI signals in month two will outperform a website that tries to do both simultaneously and does neither well.
AI search is growing. Traditional search still dominates. The businesses that will win in AI search are the ones whose foundations are strong enough to support it.
Check your SEO foundation and AI readiness together
Get a free audit across 27 factors — 18 AI readiness + 9 SEO foundations.