
What Crawlers Actually See on Your JavaScript Site (And Why It Matters)

Most crawlers can't execute JavaScript. Here's exactly what they see when they visit your React, Vue, or Angular site — with side-by-side comparisons and tests you can run right now.

Eric Neff · March 20, 2026 · 4 min read

Two versions of every page

Every JavaScript-rendered website has two faces: the one humans see, and the one bots see.

For a static HTML site, these are identical. The server sends complete HTML, and that's what everyone gets — browsers, crawlers, social bots, AI agents.

For a JavaScript single-page application (React, Vue, Angular, Svelte), these two faces are radically different. The server sends a minimal HTML shell. The browser executes JavaScript to fill it with content. Crawlers that can't execute JavaScript see only the shell.

This gap is the single most impactful SEO and AI visibility issue affecting modern web applications.


The anatomy of a JavaScript page load

Here's what actually happens when different visitors hit a typical React SPA:

What the server sends (raw HTML)

<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="UTF-8" />
    <meta name="viewport" content="width=device-width, initial-scale=1.0" />
    <title>My SaaS Product</title>
    <meta name="description" content="The best tool for..." />
  </head>
  <body>
    <div id="root"></div>
    <script type="module" src="/assets/index-a1b2c3.js"></script>
  </body>
</html>

Word count visible: ~10 words. Just the title and meta description. No headings, no product content, no features, no pricing, no testimonials.

What the browser renders (after JavaScript)

After JavaScript executes, the same page contains:

  • A complete navigation with links
  • An H1 heading with the product name
  • Feature descriptions across multiple sections
  • Pricing tables
  • Customer testimonials
  • FAQ sections
  • Footer with contact information
  • Structured data (JSON-LD)
  • Open Graph meta tags (if injected via React Helmet)

Word count visible: 800–2,000+ words. All the content that makes your site useful and discoverable.

The gap between these two versions is your visibility gap.


Which crawlers execute JavaScript?

This is the critical question. Here's the definitive answer:

| Crawler | Executes JS | What it sees |
| --- | --- | --- |
| Googlebot | Yes | Full rendered content (usually) |
| Bingbot | Partially | Often misses JS-rendered content |
| GPTBot (OpenAI) | No | Raw HTML only |
| ChatGPT-User | No | Raw HTML only |
| ClaudeBot (Anthropic) | No | Raw HTML only |
| PerplexityBot | No | Raw HTML only |
| CCBot (Common Crawl) | No | Raw HTML only |
| Cohere-ai | No | Raw HTML only |
| Meta-ExternalAgent | No | Raw HTML only |
| Bytespider (TikTok) | No | Raw HTML only |
| FacebookExternalHit | No | Raw HTML only |
| Twitterbot | No | Raw HTML only |
| LinkedInBot | No | Raw HTML only |
| WhatsApp | No | Raw HTML only |
| Slackbot | No | Raw HTML only |
| Discordbot | No | Raw HTML only |
| DuckDuckBot | No | Raw HTML only |
| YandexBot | No | Raw HTML only |
| Baiduspider | No | Raw HTML only |

Googlebot is the only major crawler that reliably executes JavaScript. Every AI crawler, every social bot, and most other search engine bots read only the raw HTML your server sends.


How to test what crawlers see on your site

Test 1: curl (the quickest check)

curl -s https://your-site.com | wc -w

This shows the word count of your raw HTML. Compare it to the word count you see in the browser. If there's a large gap, you have a visibility problem.

For a more detailed view:

curl -s https://your-site.com | grep -i '<h1\|<h2\|<h3\|<p'

If this returns nothing (or very little), your headings and content are JavaScript-rendered only.
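To quantify the gap in one shot, compare the raw HTML against the DOM a headless browser produces. This is a sketch that assumes headless Chrome is installed and invocable as `google-chrome` (the binary name varies: `chromium`, `chrome`, etc.); `https://your-site.com` is a placeholder:

```shell
#!/bin/sh
URL="https://your-site.com"   # placeholder -- substitute your own site

# Word count of the raw HTML the server sends (what most bots see).
raw=$(curl -s "$URL" | wc -w)

# Word count after JavaScript executes (--dump-dom prints the final DOM).
rendered=$(google-chrome --headless --dump-dom "$URL" 2>/dev/null | wc -w)

echo "Raw:      $raw words"
echo "Rendered: $rendered words"

# Visibility gap as a percentage of the rendered count.
if [ "$rendered" -gt 0 ]; then
  echo "Gap:      $(( (rendered - raw) * 100 / rendered ))%"
fi
```

With counts like 10 raw words against 1,333 rendered, this reports a 99% gap.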

Test 2: Simulate a crawler with curl

Test with a specific bot's user-agent:

curl -s -H "User-Agent: Mozilla/5.0 (compatible; GPTBot/1.0)" https://your-site.com | head -50

This shows exactly what GPTBot sees when it visits your site.
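You can run the same check over several bot user-agents at once. The UA tokens below are simplified stand-ins, not the bots' full header strings:

```shell
#!/bin/sh
URL="https://your-site.com"   # placeholder

# Simplified user-agent tokens; real bots send longer header values.
for ua in "GPTBot/1.0" "ClaudeBot/1.0" "PerplexityBot/1.0" "facebookexternalhit/1.1"; do
  words=$(curl -s -H "User-Agent: Mozilla/5.0 (compatible; $ua)" "$URL" | wc -w)
  echo "$ua -> $words words"
done
```

If every line reports roughly the same low word count as your raw-HTML check, none of these bots are seeing your content.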

Test 3: Google's URL Inspection Tool

In Google Search Console, use the URL Inspection tool to see how Google renders your page. This is useful for the Googlebot perspective, but remember — Googlebot is the exception, not the rule.

Test 4: CrawlReady audit

Run a CrawlReady audit for the most comprehensive view. It shows:

  • Raw HTML word count vs. rendered word count
  • Exact visibility gap percentage
  • Which specific bots are affected
  • Meta tag analysis (title, description, canonical, OG tags)
  • Heading structure comparison
  • AI crawler accessibility assessment

Real-world example: the visibility gap in numbers

Here's actual audit data from a React SPA we analyzed:

| Metric | Raw HTML (what bots see) | Rendered (what humans see) |
| --- | --- | --- |
| Word count | 10 | 1,333 |
| H1 tags | 0 | 1 |
| H2 tags | 0 | 8 |
| Paragraphs | 0 | 24 |
| Links | 2 | 47 |
| Images | 0 | 12 |
| JSON-LD blocks | 0 | 3 |

Visibility gap: 99%

This site had solid content, proper heading structure, good structured data — all invisible to 95% of crawlers. The site scored 85/100 on technical SEO fundamentals. The only problem was that none of it was in the initial HTML.


Why this matters beyond SEO

The visibility gap affects more than just search rankings:

Social media previews

When you share a link on LinkedIn, X, Facebook, Slack, or Discord, those platforms send a bot to fetch your page and extract Open Graph tags for the preview card. If your OG tags are injected via JavaScript, the bot gets nothing — and your shared link appears as a blank card with no title, no image, no description.
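You can check this directly: fetch the page with a preview bot's user-agent and grep for `og:` meta tags. The UA token below is Facebook's crawler identifier; the URL is a placeholder:

```shell
# If this prints nothing, your share cards will render blank.
curl -s -H "User-Agent: facebookexternalhit/1.1" https://your-site.com \
  | grep -io '<meta[^>]*property="og:[^"]*"[^>]*>'
```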

AI search and discovery

ChatGPT, Perplexity, Claude, and Gemini build their knowledge by crawling websites. If their crawlers can't read your content, these AI systems literally don't know your product exists. When someone asks "What's the best tool for [your category]?" — you won't be in the answer.

Accessibility

Screen readers and assistive technologies sometimes rely on the initial HTML. While most modern screen readers do execute JavaScript, the initial HTML determines what's available immediately vs. what requires waiting for scripts to load.

Performance

Time-to-first-meaningful-content for bots is effectively infinite on a JS-only SPA: they never get the content at all. Even Googlebot's Web Rendering Service queues pages between the initial crawl and JavaScript rendering, a delay that can stretch to days or weeks for new or low-authority sites.


The three solutions

1. Server-side rendering (SSR)

Frameworks like Next.js, Nuxt, SvelteKit, and Remix render pages on the server and send complete HTML to every visitor.

  • Pros: Clean solution, good performance, works for all crawlers
  • Cons: Requires a framework migration if you already have a client-rendered SPA. Cost: $30K–$80K+ and 2–4 months for an existing app.

2. Static site generation (SSG)

Pre-build all pages as static HTML at build time.

  • Pros: Fast, simple, cheap to host
  • Cons: Only works for content that doesn't change per-request. Not viable for dynamic apps.

3. Pre-rendering middleware

Intercept crawler requests and serve them rendered HTML, while human visitors get the normal SPA experience.

  • Pros: No code changes, deploys in minutes, works with any SPA framework
  • Cons: Adds a middleware layer (though edge-based solutions like CrawlReady have negligible latency)

For existing SPAs where a framework migration isn't practical, pre-rendering is the fastest path to fixing the visibility gap.
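Whichever solution you deploy, verification is the same curl test run twice: once as a browser and once as a bot. Once pre-rendering is in place, the bot response should contain your real content (the URL is a placeholder):

```shell
#!/bin/sh
URL="https://your-site.com"   # placeholder

human=$(curl -s "$URL" | wc -w)
bot=$(curl -s -H "User-Agent: Mozilla/5.0 (compatible; GPTBot/1.0)" "$URL" | wc -w)

echo "Browser UA: $human words"
echo "GPTBot UA:  $bot words"
# With pre-rendering working, the bot count should jump to roughly the
# rendered word count, not the ~10-word shell.
```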


Take action

The first step is always the same: find out what crawlers actually see on your site.

Run a free CrawlReady audit to get a complete picture of your visibility gap — word counts, meta tags, heading structure, AI crawler accessibility, and specific issues to fix.

If your site has content that matters for search visibility or AI discoverability, and you haven't checked what the raw HTML looks like, you probably have a bigger gap than you think.


This guide covers the crawler visibility landscape as of March 2026. The core principle — most crawlers don't execute JavaScript — has been true since SPAs first appeared and shows no signs of changing, even as AI crawlers proliferate.

Run a free audit and see exactly what Google, ChatGPT, Perplexity, and 20+ crawlers see on your site. Results in 15 seconds.

#javascript-seo #spa #crawlers #react #vue #angular #ai-visibility