
The AI Builder SEO Checklist: 15 Things to Fix After Lovable, Bolt, or Base44 Generates Your Site

Your AI builder shipped a beautiful app. Now make it findable. A prioritized 15-item checklist covering visibility, meta tags, content structure, and performance for AI-generated sites.

CrawlReady Team · March 30, 2026 · 4 min read

Your AI builder handles the building. You handle the finding.

Lovable, Bolt.new, Base44, and other AI builders are incredible at generating working applications. In minutes, you have a functional product with UI, auth, database, and deployment.

What they don't generate: search visibility.

Every AI builder produces a client-side rendered JavaScript application. That means crawlers that don't execute JavaScript, including most AI crawlers and every social media bot, see an empty page instead of your content. But the visibility gap goes beyond rendering: there are 15 specific issues to address.

This checklist is prioritized. Start at the top and work down.


Priority 1: Visibility (fix these first)

These issues make or break whether crawlers can see your site at all. Nothing else matters until these are resolved.

1. Serve rendered HTML to crawlers

Priority: Critical

The problem: Your AI-generated app renders content via JavaScript. Crawlers that don't execute JS (GPTBot, ClaudeBot, PerplexityBot, all social bots) see an empty page.

Check it:

curl -s https://your-site.com | wc -w

If the word count is under 50, crawlers see almost nothing.

Fix it: Deploy pre-rendering middleware like CrawlReady that serves rendered HTML to bots while humans get the normal SPA experience. Takes under 1 hour. Zero code changes.

Alternative: Migrate to an SSR framework (Next.js, Nuxt). Takes 2–4 months and $30K–$80K+.
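Whichever product you use, the pre-rendering approach boils down to a user-agent split. Here is a minimal sketch of that split as Express-style middleware; the names (`isBot`, `prerenderMiddleware`, `renderToHtml`) are illustrative, not CrawlReady's API:

```javascript
// Known non-JS crawlers we want to serve rendered HTML to.
const BOT_PATTERNS = [
  /GPTBot/i, /ClaudeBot/i, /PerplexityBot/i,
  /facebookexternalhit/i, /Twitterbot/i, /LinkedInBot/i, /Slackbot/i,
];

// Returns true when the request's User-Agent matches a known crawler.
function isBot(userAgent) {
  return BOT_PATTERNS.some((p) => p.test(userAgent || ''));
}

// Express-style middleware: bots get pre-rendered HTML, humans get the SPA.
function prerenderMiddleware(renderToHtml) {
  return async (req, res, next) => {
    if (!isBot(req.headers['user-agent'])) return next(); // humans: normal SPA
    const html = await renderToHtml(req.originalUrl);     // bots: rendered HTML
    res.set('Content-Type', 'text/html').send(html);
  };
}
```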


2. Configure robots.txt to allow AI crawlers

Priority: Critical

The problem: Many sites accidentally block AI bots through default configurations, hosting platform settings, or CMS plugins.

Check it:

curl https://your-site.com/robots.txt

Look for Disallow: / after GPTBot, ClaudeBot, or PerplexityBot user-agents.

Fix it: Create or update your robots.txt:

User-agent: *
Allow: /

User-agent: GPTBot
Allow: /

User-agent: ChatGPT-User
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

Sitemap: https://your-site.com/sitemap.xml

Place this in your public/ directory so it's served at the root of your domain.
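If you'd rather script the check than eyeball the file, a simplified parser like this can flag a blocked bot. It is deliberately naive (real robots.txt parsing also handles wildcards, grouped user-agent lines, and Allow precedence), so treat it as a smoke test:

```javascript
// Simplified robots.txt check: is `bot` disallowed from the site root?
// Naive by design — does not handle wildcards or Allow precedence.
function isBotBlocked(robotsTxt, bot) {
  let applies = false;
  for (const raw of robotsTxt.split('\n')) {
    const line = raw.trim();
    if (/^user-agent:/i.test(line)) {
      const agent = line.split(':')[1].trim();
      applies = agent === '*' || agent.toLowerCase() === bot.toLowerCase();
    } else if (applies && /^disallow:\s*\/\s*$/i.test(line)) {
      return true; // "Disallow: /" under a matching user-agent group
    }
  }
  return false;
}
```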


3. Generate and submit an XML sitemap

Priority: Critical

The problem: SPAs don't automatically generate sitemaps. Without one, crawlers have to discover your pages by following links — which they can't do if your navigation is JavaScript-rendered.

Check it: Visit https://your-site.com/sitemap.xml. If it 404s, you don't have one.

Fix it: Create a sitemap.xml in your public/ directory listing all important routes:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://your-site.com/</loc>
    <lastmod>2026-03-15</lastmod>
  </url>
  <url>
    <loc>https://your-site.com/features</loc>
    <lastmod>2026-03-15</lastmod>
  </url>
  <url>
    <loc>https://your-site.com/pricing</loc>
    <lastmod>2026-03-15</lastmod>
  </url>
  <url>
    <loc>https://your-site.com/blog</loc>
    <lastmod>2026-03-15</lastmod>
  </url>
</urlset>

Submit it to Google Search Console.
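Rather than hand-editing the XML every time routes change, you can generate it at build time. A sketch, where `BASE_URL` and the `routes` array are placeholders for your own site:

```javascript
// Build-time sitemap generator — BASE_URL and routes are placeholders.
const BASE_URL = 'https://your-site.com';
const routes = ['/', '/features', '/pricing', '/blog'];

function buildSitemap(base, paths, lastmod) {
  const urls = paths.map((p) => [
    '  <url>',
    `    <loc>${base}${p}</loc>`,
    `    <lastmod>${lastmod}</lastmod>`,
    '  </url>',
  ].join('\n')).join('\n');
  return `<?xml version="1.0" encoding="UTF-8"?>\n` +
    `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n` +
    `${urls}\n</urlset>\n`;
}

// Write it to public/ so it deploys with the app:
// require('fs').writeFileSync('public/sitemap.xml',
//   buildSitemap(BASE_URL, routes, '2026-03-15'));
```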


Priority 2: Meta Tags (fix these next)

These control how your site appears in search results and social previews.

4. Set a unique, descriptive title tag for every page

Priority: High

The problem: AI builders often set a default title like "Vite + React" or "My App" that shows up in search results.

Check it:

curl -s https://your-site.com | grep -i "<title>"

Fix it: In your index.html, set a descriptive default title. For per-page titles, use React Helmet (note: requires pre-rendering to be visible to bots):

<title>Your Product — A Clear One-Line Description</title>

Guidelines:

  • Under 60 characters (Google truncates longer titles)
  • Include your product name
  • Front-load the most important words
  • Unique per page
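You can enforce the length and uniqueness guidelines in a quick script. `titleIssues` is a hypothetical helper, not a library API; feed it a page-to-title map from your route config:

```javascript
// Flags titles that break the guidelines: over 60 characters, or
// duplicated across pages. Hypothetical helper for a pre-deploy check.
function titleIssues(titlesByPage) {
  const issues = [];
  const seen = new Map();
  for (const [page, title] of Object.entries(titlesByPage)) {
    if (title.length > 60) issues.push(`${page}: over 60 characters`);
    if (seen.has(title)) issues.push(`${page}: duplicates ${seen.get(title)}`);
    seen.set(title, page);
  }
  return issues;
}
```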

5. Write a compelling meta description for every page

Priority: High

The problem: No meta description in the raw HTML means search engines auto-generate one — usually poorly.

Check it:

curl -s https://your-site.com | grep -i "meta.*description"

Fix it: Add to your index.html and set per-page via React Helmet:

<meta name="description" content="A clear, compelling description of your product. Under 160 characters. This is your ad copy in search results." />

Guidelines:

  • Under 160 characters
  • Include your value proposition
  • Write it like a mini ad — this appears in search results
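The curl | grep check above can be scripted as well. This regex-based extractor is approximate (it assumes the `name` attribute comes before `content`), but it catches the two common failures: no description at all, and one over 160 characters:

```javascript
// Extract the meta description from raw HTML and check its length.
// Regex-based, so approximate — assumes name comes before content.
function checkDescription(html) {
  const m = html.match(/<meta\s+name=["']description["']\s+content=["']([^"']*)["']/i);
  if (!m) return { present: false };
  return { present: true, length: m[1].length, tooLong: m[1].length > 160 };
}
```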

6. Set canonical URLs on every page

Priority: High

The problem: Without canonical URLs, search engines may index duplicate versions of your pages (with/without trailing slash, with query parameters, etc.).

Check it:

curl -s https://your-site.com | grep -i "canonical"

Fix it:

<link rel="canonical" href="https://your-site.com/" />

For SPAs, use React Helmet to set page-specific canonicals:

<Helmet>
  <link rel="canonical" href="https://your-site.com/pricing" />
</Helmet>

Common issues:

  • Old subdomains referenced in canonical
  • Trailing slash inconsistency
  • HTTP vs HTTPS mismatch
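All three pitfalls come down to picking one canonical form and sticking to it. A sketch of one such policy (https, no query string, no trailing slash except the root) using Node's built-in URL class:

```javascript
// Normalize a URL into one canonical form. The policy here is
// illustrative: force https, drop queries/fragments, strip trailing slash.
function canonicalize(url) {
  const u = new URL(url);
  u.protocol = 'https:';
  u.search = ''; // drop query parameters
  u.hash = '';
  if (u.pathname !== '/' && u.pathname.endsWith('/')) {
    u.pathname = u.pathname.slice(0, -1); // strip trailing slash
  }
  return u.toString();
}
```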

7. Add Open Graph tags for social sharing

Priority: High

The problem: Without OG tags in the raw HTML, social previews are blank when you share your link on LinkedIn, X, Slack, or Discord.

Check it: Share your URL in a Slack DM or use Facebook's Sharing Debugger.

Fix it: Add to index.html as defaults, and set per-page via React Helmet:

<meta property="og:title" content="Your Product Name" />
<meta property="og:description" content="A compelling description." />
<meta property="og:image" content="https://your-site.com/og-image.jpg" />
<meta property="og:url" content="https://your-site.com" />
<meta property="og:type" content="website" />
<meta name="twitter:card" content="summary_large_image" />

Image requirements:

  • 1200 x 630px minimum
  • Absolute URL (not relative)
  • No spaces in filename
  • Under 5MB
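The URL-level requirements are easy to lint before you ship. This hypothetical helper checks the two that don't require fetching the file (dimensions and file size need a separate request to verify):

```javascript
// Lint an og:image URL: must be absolute and contain no spaces.
// Hypothetical helper — image dimensions/size need a separate fetch.
function ogImageUrlIssues(url) {
  const issues = [];
  if (!/^https?:\/\//.test(url)) issues.push('not an absolute URL');
  if (/\s/.test(url)) issues.push('contains spaces');
  return issues;
}
```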

Priority 3: Content Structure (optimize these for rankings and AI)

8. Use one H1 tag per page

Priority: Medium

The problem: AI builders sometimes generate pages without a clear H1, or with multiple H1 tags.

Check it: In your browser's developer tools, run:

document.querySelectorAll('h1').length

Fix it: Every page should have exactly one <h1> that clearly states what the page is about. This is the most important heading for both search engines and AI systems.


9. Maintain proper heading hierarchy

Priority: Medium

The problem: Heading levels skip (H1 directly to H3) or are used for styling rather than structure.

Check it: Inspect your page headings. They should follow: H1 → H2 → H3 without skipping levels.

Fix it: Use headings for content structure, not visual styling. If you need smaller text, use CSS classes instead of heading tags.
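The "no skipped levels" rule can be checked mechanically. Run the extraction part in your browser console; this pure function just validates the sequence of levels:

```javascript
// Report places where the heading level jumps by more than one
// (e.g. an h1 followed directly by an h3).
function headingSkips(levels) {
  const skips = [];
  for (let i = 1; i < levels.length; i++) {
    if (levels[i] > levels[i - 1] + 1) {
      skips.push(`h${levels[i - 1]} jumps to h${levels[i]} at position ${i}`);
    }
  }
  return skips;
}

// In the browser console:
// headingSkips([...document.querySelectorAll('h1,h2,h3,h4,h5,h6')]
//   .map((h) => Number(h.tagName[1])));
```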


10. Add JSON-LD structured data

Priority: Medium

The problem: AI builders don't generate structured data. Without it, search engines and AI systems have less context about your entities.

Check it:

curl -s https://your-site.com | grep "application/ld+json"

Fix it: Add Organization schema at minimum:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Your Product",
  "url": "https://your-site.com",
  "description": "What your product does in one sentence."
}
</script>

Add Product, FAQ, and Article schemas as appropriate. Note: JSON-LD added via React components requires pre-rendering to be visible to non-JS crawlers.


11. Write clear entity descriptions

Priority: Medium

The problem: AI builders generate functional UI but often lack clear, specific descriptions of what the product is and who it's for.

Check it: Ask ChatGPT: "What is [your product name]?" If it doesn't know or gets it wrong, your entity descriptions need work.

Fix it: On your homepage, clearly and specifically state:

  • What your product is (in the first paragraph)
  • Who it's for
  • What problem it solves
  • How it's different from alternatives

Use your product name explicitly. Be specific — "AI-powered music coaching platform that analyzes vocal performances" is better than "a tool for musicians."


12. Create FAQ content

Priority: Medium

The problem: AI systems extract Q&A pairs from FAQ content. Without it, you miss citation opportunities.

Fix it: Add a FAQ section to your homepage or create a dedicated FAQ page. Structure it clearly:

<h2>Frequently Asked Questions</h2>
<h3>What is [Your Product]?</h3>
<p>[Direct answer]</p>
<h3>How does [Your Product] work?</h3>
<p>[Direct answer]</p>

Add FAQPage schema for these sections. Focus on questions people would actually ask ChatGPT or Perplexity about your category.
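If you keep your Q&A pairs in data, you can generate the matching FAQPage schema at build time rather than writing the JSON by hand. A sketch (embed the output in a `<script type="application/ld+json">` tag):

```javascript
// Build FAQPage JSON-LD from an array of [question, answer] pairs.
function faqSchema(pairs) {
  return JSON.stringify({
    '@context': 'https://schema.org',
    '@type': 'FAQPage',
    mainEntity: pairs.map(([q, a]) => ({
      '@type': 'Question',
      name: q,
      acceptedAnswer: { '@type': 'Answer', text: a },
    })),
  }, null, 2);
}
```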


Priority 4: Performance (polish these last)

13. Optimize images

Priority: Low-Medium

The problem: AI builders sometimes use unoptimized images — large files, wrong formats, no lazy loading.

Check it: Run PageSpeed Insights and check the image optimization suggestions.

Fix it:

  • Convert to WebP format
  • Resize to appropriate dimensions
  • Add loading="lazy" to images below the fold
  • Use descriptive filenames (no spaces or special characters)
  • Add meaningful alt text

14. Minimize JavaScript bundle size

Priority: Low-Medium

The problem: Large JavaScript bundles slow initial page load, which affects Core Web Vitals.

Check it: Check your bundle size in the browser's Network tab. Vite + React apps often produce 200KB–1MB+ bundles.

Fix it:

  • Enable code splitting (React.lazy + Suspense for route-based splitting)
  • Remove unused dependencies
  • Use tree-shaking-friendly imports
  • Enable gzip/brotli compression on your server/CDN
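Beyond route-based splitting, you can also pull heavy, rarely-changing libraries into their own cacheable chunk. One way to do this in a Vite build (the chunk name here is arbitrary):

```javascript
// vite.config.js — split vendor code into a separate, cacheable chunk.
// Assumes a Vite build; adjust the library list to your dependencies.
import { defineConfig } from 'vite';

export default defineConfig({
  build: {
    rollupOptions: {
      output: {
        manualChunks: {
          // Heavy, rarely-changing libraries get their own chunk,
          // so app-code changes don't invalidate the vendor cache.
          vendor: ['react', 'react-dom'],
        },
      },
    },
  },
});
```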

15. Set up Google Search Console

Priority: Low (but do it early)

The problem: Without GSC, you have no visibility into how Google sees your site.

Fix it:

  1. Go to search.google.com/search-console
  2. Add and verify your domain
  3. Submit your sitemap
  4. Check for indexing issues

Review weekly for the first month, then monthly. GSC data takes 2–3 days to appear and provides invaluable insight into your search performance.


The complete checklist

| # | Item | Priority | Time to Fix |
|---|------|----------|-------------|
| 1 | Serve rendered HTML to crawlers | Critical | 1 hour |
| 2 | Configure robots.txt for AI crawlers | Critical | 5 minutes |
| 3 | Generate and submit XML sitemap | Critical | 30 minutes |
| 4 | Set unique title tags | High | 30 minutes |
| 5 | Write meta descriptions | High | 30 minutes |
| 6 | Set canonical URLs | High | 15 minutes |
| 7 | Add Open Graph tags | High | 30 minutes |
| 8 | One H1 per page | Medium | 15 minutes |
| 9 | Proper heading hierarchy | Medium | 30 minutes |
| 10 | Add JSON-LD structured data | Medium | 1 hour |
| 11 | Clear entity descriptions | Medium | 1 hour |
| 12 | Create FAQ content | Medium | 1–2 hours |
| 13 | Optimize images | Low-Medium | 1 hour |
| 14 | Minimize JS bundle | Low-Medium | 1–2 hours |
| 15 | Set up Google Search Console | Low | 15 minutes |

Total estimated time: 8–12 hours for all 15 items. The critical items (1–3) take under 2 hours and deliver 80% of the impact.


Start with an audit

Before working through this checklist manually, run a CrawlReady audit on your site. In 15 seconds, it identifies which of these issues affect you — so you can prioritize based on your actual gaps, not guesswork.

The audit checks: visibility gap, meta tags, headings, structured data, robots.txt, AI crawler accessibility, and more. It's free and gives you a clear action plan.

Run your free audit now


This checklist applies to sites generated by Lovable, Bolt.new, Base44, Replit, and any other tool that produces client-side rendered JavaScript applications. Last updated March 2026.

Run a free audit and see exactly what Google, ChatGPT, Perplexity, and 20+ crawlers see on your site. Results in 15 seconds.

Run Free Audit
#checklist #lovable #bolt #base44 #seo #ai-builder #technical-seo