Why WordPress Sites Are Becoming Invisible to AI
WordPress themes ship heavy JavaScript that AI crawlers cannot execute. The average WordPress site scores 40-70 on Lighthouse, compared to 95+ for static site generators like Astro. Plugin bloat creates crawl budget waste, carousels confuse AI extraction, and WP2Static — the main static export tool — is effectively dead. This isn't anti-WordPress. It's about the growing gap between what WordPress delivers and what AI crawlers need.
The core problem: JavaScript rendering
AI crawlers — GPTBot, ClaudeBot, PerplexityBot, Google-Extended — read the raw HTML source. They do not execute JavaScript. When page builders like Elementor, Divi, or WPBakery render content via JavaScript, that content is invisible to AI search platforms. The page looks complete in a browser but empty to a crawler.
This isn't theoretical. Test any JavaScript-heavy WordPress site:
- View page source (Ctrl+U / Cmd+U) — is the content there?
- Disable JavaScript and reload — does the content disappear?
- If content only appears after JS executes, AI crawlers cannot see it
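The same test can be scripted. Below is a minimal sketch in Python, assuming the `requests` and `beautifulsoup4` packages are installed, that fetches a page the way an AI crawler does — raw HTML, no JavaScript execution — and reports how much readable text is actually in the source:

```python
import requests
from bs4 import BeautifulSoup

URL = "https://example.com/"  # replace with the page you want to test

html = requests.get(URL, timeout=30).text
soup = BeautifulSoup(html, "html.parser")

# Remove script/style blocks so only human-readable text is counted
for tag in soup(["script", "style", "noscript"]):
    tag.decompose()

text = " ".join(soup.get_text(separator=" ").split())
print(f"Readable words in raw HTML: {len(text.split())}")
print(text[:300])  # the opening text a crawler would actually extract
```

If the word count is close to zero, or the extracted text is little more than navigation labels, the page is JavaScript-dependent.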
Which WordPress themes and builders are affected
| Theme / Builder | JS dependency | AI crawler visibility |
|---|---|---|
| Elementor | Heavy — renders via JS widgets | Partial to poor |
| Divi | Heavy — custom rendering engine | Partial to poor |
| WPBakery | Heavy — shortcode-based JS rendering | Poor |
| Gutenberg (default) | Moderate — server-side with JS enhancement | Moderate to good |
| GeneratePress / Astra | Light — mostly CSS-based | Good |
| Classic themes | Minimal | Good |
The Lighthouse gap
Lighthouse performance scores are a proxy for how efficiently a site delivers content to crawlers. The gap between WordPress and modern static sites is stark:
| Platform | Average Lighthouse performance | AI crawler experience |
|---|---|---|
| WordPress (Elementor/Divi) | 40-55 | Slow, JS-dependent, partial content |
| WordPress (lightweight theme) | 60-75 | Moderate, some JS issues |
| Squarespace | 50-65 | Moderate, some client rendering |
| Wix | 35-55 | Poor, heavy JS framework |
| Astro (SSG) | 95-100 | Excellent, pure HTML |
| Hugo / Eleventy | 95-100 | Excellent, pure HTML |
| Next.js (static export) | 85-95 | Very good, pre-rendered HTML |
The performance gap directly translates to an AI visibility gap. Faster, cleaner sites get crawled more efficiently and have their content fully extracted.
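To see where your own pages sit in this table, Google's public PageSpeed Insights v5 API returns the same Lighthouse performance score. A minimal sketch in Python — the URLs are placeholders, and an API key is optional for occasional use:

```python
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
URLS = [
    "https://your-wordpress-site.example/",  # placeholder
    "https://your-static-site.example/",     # placeholder
]

for url in URLS:
    resp = requests.get(
        PSI_ENDPOINT,
        params={"url": url, "category": "PERFORMANCE", "strategy": "mobile"},
        timeout=120,  # PageSpeed runs a full Lighthouse audit, so it is slow
    )
    score = resp.json()["lighthouseResult"]["categories"]["performance"]["score"]
    print(f"{url}: Lighthouse performance {round(score * 100)}")
```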
Plugin bloat and crawl budget waste
The average WordPress site runs 20-30 plugins, each adding JavaScript, CSS, and HTTP requests. AI crawlers have limited crawl budgets — the number of pages they'll fetch in a single visit. Plugin bloat slows response times and wastes crawl budget on non-content resources, meaning fewer of your actual content pages get crawled.
Common crawl budget wasters
- Slider plugins — add 200-500KB of JS per page, content often invisible to crawlers
- Social sharing widgets — external scripts that slow every page
- Live chat widgets — load heavy JS frameworks
- Page builder CSS — Elementor alone adds 300KB+ of CSS
- Analytics stacking — multiple tracking scripts competing for load
- Security plugins — some block AI crawler user agents by default
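A rough way to gauge how much of this weight a page carries is to count the external scripts and stylesheets referenced in the raw HTML. A sketch, again assuming `requests` and `beautifulsoup4`:

```python
import requests
from bs4 import BeautifulSoup

URL = "https://example.com/"  # replace with a page from your site

soup = BeautifulSoup(requests.get(URL, timeout=30).text, "html.parser")

scripts = [tag["src"] for tag in soup.find_all("script", src=True)]
styles = [tag["href"] for tag in soup.find_all("link", rel="stylesheet") if tag.get("href")]

print(f"External scripts:     {len(scripts)}")
print(f"External stylesheets: {len(styles)}")
for url in scripts + styles:
    print("  ", url)
```

A lean content page typically loads a handful of these; a builder-heavy WordPress page can load dozens.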
The carousel problem
77% of UK SME websites use homepage carousels. These are problematic for AI crawlers because:
- Content rotates via JavaScript — crawlers see only the first slide or nothing
- Key messages are hidden in slides 2-5 that never load without JS
- Carousels add significant page weight for minimal informational value
- AI extraction algorithms struggle with rotating content — they need static, stable text
Replace carousels with static, clearly structured content blocks. Every piece of information should be visible in the HTML source without JavaScript.
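To confirm nothing important is trapped inside a carousel, check that each key message appears in the raw HTML source. A minimal sketch, with hypothetical placeholder phrases to replace with your own slide headlines:

```python
import requests

URL = "https://example.com/"   # replace with your homepage
KEY_MESSAGES = [               # replace with each slide's headline
    "Free next-day delivery",
    "Trusted by 500 local businesses",
]

html = requests.get(URL, timeout=30).text.lower()

for message in KEY_MESSAGES:
    status = "visible in source" if message.lower() in html else "MISSING from source"
    print(f"{message}: {status}")
```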
WP2Static is effectively dead
WP2Static was the main tool for converting WordPress sites to static HTML — which would have solved the JavaScript rendering problem. However, the project is effectively abandoned:
- Last meaningful update was years ago
- Compatibility issues with modern WordPress versions
- Plugin conflicts with popular themes and page builders
- No active maintenance or support
This means WordPress sites cannot easily convert to static HTML for AI crawler compatibility. The alternative is migrating to a static site generator or ensuring your WordPress theme renders content server-side.
No edge deployment integration
Modern static sites deploy to edge networks (Cloudflare, Vercel, Netlify) that offer:
- Markdown for Agents — Cloudflare's feature that serves clean markdown to AI crawlers
- Edge caching — sub-50ms response times globally
- Automatic prerendering — every page is pre-built HTML
- Built-in IndexNow — instant Bing notification on publish
WordPress hosting is typically traditional server-based, without these AI-specific features. While WordPress can be put behind a CDN, it doesn't natively integrate with edge deployment features designed for AI crawlers.
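The IndexNow part, at least, does not require edge hosting: any site can ping the endpoint directly after publishing. A minimal sketch, assuming you have generated an IndexNow key and host the matching {key}.txt file at your site root as the protocol requires:

```python
import requests

KEY = "your-indexnow-key"               # hypothetical key value
PAGE = "https://example.com/new-post/"  # the URL you just published

resp = requests.get(
    "https://api.indexnow.org/indexnow",
    params={"url": PAGE, "key": KEY},
    timeout=30,
)
print(resp.status_code)  # 200 or 202 means the submission was accepted
```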
What WordPress site owners should do
This isn't "abandon WordPress." It's about understanding the gap and taking action:
If staying on WordPress
- Switch to a lightweight theme — GeneratePress, Astra, or Kadence instead of Elementor/Divi
- Audit plugin count — remove anything non-essential, target under 15 plugins
- Remove carousels — replace with static content blocks
- Test JavaScript dependency — disable JS and check if content is still visible
- Implement AI-specific schema — manually or via a schema plugin
- Configure robots.txt — ensure AI crawlers aren't blocked by security plugins; see the check after this list
- Submit to Bing Webmaster Tools — most WordPress sites only submit to Google
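For the robots.txt point above, Python's standard library can check whether the file your site actually serves — including any rules injected by security plugins — blocks the AI crawlers named earlier. Note this only covers robots.txt rules, not server-level user-agent blocking:

```python
from urllib.robotparser import RobotFileParser

SITE = "https://example.com"  # replace with your domain
PAGE = SITE + "/"             # a representative content URL
AI_CRAWLERS = ["GPTBot", "ClaudeBot", "PerplexityBot", "Google-Extended"]

parser = RobotFileParser(SITE + "/robots.txt")
parser.read()  # fetches and parses the live robots.txt

for agent in AI_CRAWLERS:
    allowed = parser.can_fetch(agent, PAGE)
    print(f"{agent}: {'allowed' if allowed else 'BLOCKED by robots.txt'}")
```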
If considering migration
Static site generators like Astro deliver objectively better AI crawler experiences. If your site is primarily content (blogs, guides, service pages), migration to an SSG deployed on Cloudflare or Vercel will give you:
- 95+ Lighthouse scores vs 40-70
- Pure HTML that every AI crawler can read
- Edge deployment with Markdown for Agents
- Website structure optimised for AI from the ground up
Oliver Mackman
AI Search Analyst, SEOCompare
Oliver leads SEOCompare's editorial and comparison research. With over a decade in digital marketing, he oversees agency evaluation, tool testing, and AI search data analysis.
Last reviewed: 7 April 2026