
5 Signs Your Website Has an AI Visibility Problem

Most founders don't know their site is invisible to AI search. Here are 5 warning signs — and a quick test for each one.

Eric Neff · March 25, 2026 · 4 min read

Your site might be invisible and you'd never know

The tricky thing about AI visibility problems is that everything looks fine from the outside. Your site loads fast. Users love it. Maybe you're even getting some Google traffic.

But underneath, a growing slice of how people discover products — AI search engines like ChatGPT, Perplexity, Claude, and Gemini — can't see your content at all.

Here are the 5 warning signs that your site has an AI visibility problem, with a quick test for each one.


Sign 1: ChatGPT doesn't know your product exists

The test: Open ChatGPT and ask: "What is [your product name]?"

What a good result looks like: ChatGPT accurately describes your product, what it does, who it's for, and maybe mentions key features or pricing.

What a bad result looks like:

  • "I don't have specific information about [product name]."
  • A generic or incorrect description
  • It describes a competitor instead

Why it happens: ChatGPT's knowledge comes from GPTBot, which crawls websites. If GPTBot can't read your content — because it's rendered by JavaScript that GPTBot can't execute — then ChatGPT has nothing to work with.

The fix: Serve rendered HTML to AI crawlers so they can actually read your content.
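One way to check what a crawler actually receives is to fetch your page with a bot user agent and count the words left after stripping tags. A rough sketch — in it, your-site.com is a placeholder and the user-agent string is simplified:

```shell
# count_words: rough visible-word count of HTML read from stdin
# (replaces tags with spaces, then counts whitespace-separated words).
count_words() {
  sed -e 's/<[^>]*>/ /g' | wc -w | tr -d ' '
}

# Usage (your-site.com is a placeholder):
#   curl -s -A "GPTBot" https://your-site.com | count_words
# An SPA shell typically scores near zero.
```

If the count is near zero, GPTBot has nothing to index regardless of what users see in the browser.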


Sign 2: Your link previews are broken

The test: Share your site's URL in a private Slack channel, LinkedIn post draft, or Twitter/X compose window. Look at the preview card.

What a good result looks like: A rich preview card with your page title, description, and an image.

What a bad result looks like:

  • Just a bare URL with no preview
  • A generic title like "React App" or "Vite App"
  • Missing description or image
  • Wrong or outdated information

Why it happens: Social media platforms (LinkedIn, X, Facebook, Slack, Discord) send bots to fetch your page when a link is shared. These bots read the raw HTML — they don't execute JavaScript. If your Open Graph tags are injected by React, Vue, or any JS framework, the bots see nothing.

The fix: Either add OG tags directly to your index.html (limited, since every route then shares the same tags), or use pre-rendering to serve complete HTML to social bots.
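You can check whether any Open Graph tags survive in the raw, unrendered HTML with a quick grep. A sketch, assuming double-quoted property attributes and with your-site.com as a placeholder:

```shell
# og_tags: print any Open Graph <meta> tags found in HTML on stdin,
# or a warning if the raw HTML contains none.
og_tags() {
  grep -o '<meta[^>]*property="og:[^"]*"[^>]*>' || echo "No OG tags in raw HTML"
}

# Usage: curl -s https://your-site.com | og_tags
```

If this prints the warning but your browser's devtools show OG tags, they're being injected by JavaScript — exactly what social bots never see.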


Sign 3: Your site is built with React, Vue, Angular, or an AI builder

The test: Check your tech stack. Did you build with:

  • React (Create React App, Vite)
  • Vue (Vue CLI, Vite)
  • Angular
  • Svelte in SPA mode
  • Lovable
  • Bolt.new
  • Base44
  • Any other tool that generates a client-side rendered application

If yes: You almost certainly have a visibility problem.

Why: All of these tools produce single-page applications where content is rendered by JavaScript in the browser. The raw HTML that crawlers receive is an empty shell — typically just a <div id="root"></div> and a script tag.

Quick confirmation: Run this in your terminal:

curl -s https://your-site.com | wc -w

If the word count is under 50, crawlers see almost none of your content.

The exception: If you're using Next.js with SSR, Nuxt with SSR, or SvelteKit in SSR mode, your pages are server-rendered and this issue likely doesn't apply — but verify with the curl test anyway.
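The curl test above can be scripted across several pages at once. A sketch — the URLs are placeholders, and the 50-word threshold is the rule of thumb from the quick confirmation above:

```shell
# classify: label a raw-HTML word count against the 50-word rule of thumb.
classify() {
  if [ "$1" -lt 50 ]; then echo "red flag"; else echo "looks ok"; fi
}

# Usage (placeholder URLs; replace with your own pages):
#   for url in https://your-site.com/ https://your-site.com/pricing; do
#     words=$(curl -s "$url" | wc -w | tr -d ' ')
#     echo "$url: $words words ($(classify "$words"))"
#   done
```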


Sign 4: You get Google traffic but zero AI referrals

The test: Check your analytics for traffic from these referral sources:

  • chatgpt.com
  • perplexity.ai
  • gemini.google.com
  • claude.ai

What good looks like: Even small numbers (5–50 visits/month) from AI sources indicate that AI systems know about you and are citing your content.

What bad looks like: Zero visits from any AI referral source, despite having content that people would ask AI about.

Why it happens: If AI crawlers can't read your site, AI systems can't cite you in their answers. No citations = no referral traffic.

Caveat: AI referral traffic is still small for most sites. But if you're in a category where people regularly ask AI for recommendations (SaaS tools, dev tools, productivity apps) and you're seeing literally zero AI referrals — that's a signal.
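If you have raw server logs rather than an analytics dashboard, you can count AI referrals directly. A sketch that scans referrer fields on stdin; the four domains match the list above, and the log path in the usage comment is a placeholder:

```shell
# ai_referrals: count log lines whose referrer mentions an AI source.
ai_referrals() {
  grep -cE 'chatgpt\.com|perplexity\.ai|gemini\.google\.com|claude\.ai'
}

# Usage (access.log is a placeholder for your server's log file):
#   ai_referrals < access.log
```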


Sign 5: Your robots.txt blocks AI crawlers

The test: Visit https://your-site.com/robots.txt in your browser, or run:

curl https://your-site.com/robots.txt

What to look for:

# This blocks all AI crawlers:
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: PerplexityBot
Disallow: /

Some hosting platforms and CMS tools add these blocks by default. WordPress plugins, Cloudflare settings, and server configurations can all inject AI bot blocks without you knowing.
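You can scan a robots.txt for blanket AI-bot blocks automatically. A small awk sketch — it only flags full-site "Disallow: /" rules for the crawler names covered in this post:

```shell
# blocked_ai_bots: read robots.txt on stdin and print each AI crawler
# that is disallowed for the entire site ("Disallow: /").
blocked_ai_bots() {
  awk 'tolower($0) ~ /^user-agent:/ { ua = $2 }
       tolower($0) ~ /^disallow: *\/ *$/ &&
         ua ~ /GPTBot|ChatGPT-User|ClaudeBot|PerplexityBot/ { print ua }'
}

# Usage: curl -s https://your-site.com/robots.txt | blocked_ai_bots
```

It prints nothing when no AI crawler is fully blocked; partial rules like "Disallow: /admin" are deliberately ignored.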

What your robots.txt should say (if you want AI visibility):

User-agent: GPTBot
Allow: /

User-agent: ChatGPT-User
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

Why it matters: If AI crawlers are blocked, they can't visit your site at all — even if your content is properly rendered. This is the most easily fixed visibility problem and the one most often overlooked.


How many signs apply to you?

| Signs that apply | What it means |
| --- | --- |
| 0 | Your AI visibility is probably fine. Verify with an audit. |
| 1–2 | You likely have a partial visibility problem. Worth investigating. |
| 3–4 | You almost certainly have a significant visibility gap. Act now. |
| 5 | Your site is invisible to AI search. Fix this immediately. |

What to do next

Step 1: Run a comprehensive audit

The CrawlReady audit tool checks everything in one scan:

  • Raw HTML vs. rendered content comparison
  • Exact visibility gap percentage
  • AI crawler accessibility
  • Meta tag analysis
  • robots.txt configuration
  • Heading structure
  • Structured data visibility

It takes 15 seconds and gives you a clear picture of what's working and what's not.

Step 2: Fix the foundation

If your site is a JavaScript SPA (sign 3 applies), the root cause is that crawlers can't read your content. Pre-rendering middleware fixes this without code changes.

Step 3: Update robots.txt

If sign 5 applies, update your robots.txt to allow AI crawlers. This is a 2-minute fix.

Step 4: Re-test

After making changes, run through all 5 tests again. AI systems need time to re-crawl and update their knowledge (typically 2–4 weeks), but robots.txt changes and social preview fixes are immediate.


The cost of waiting

Every day your site is invisible to AI search, you're accumulating AI visibility debt. AI systems that could be recommending your product are recommending competitors instead. Social shares that could be driving traffic are falling flat. AI referral traffic — the fastest-growing discovery channel — is going to zero.

The fixes described here range from 2 minutes (robots.txt) to 1 hour (pre-rendering deployment). The cost of the problem compounds daily.

Start with a free CrawlReady audit to find out exactly where you stand.


These diagnostic tests are current as of March 2026. AI crawler behavior, social bot requirements, and search engine rendering capabilities may change — test regularly to stay current.

Run a free audit and see exactly what Google, ChatGPT, Perplexity, and 20+ crawlers see on your site. Results in 15 seconds.

Run Free Audit
#ai-visibility #aeo #seo #diagnosis #spa