AI Pages

Serve AI-optimized content to AI crawlers automatically - without changing your website.

8 min read · Updated Jan 11, 2026
What you'll learn
  • Understand why AI crawlers can't see most of your website
  • Learn how AI Pages makes your content visible to AI models
  • See the specific optimizations AI Pages applies
  • No changes to your existing website required

The invisible website problem

Here's something most marketers don't realize: AI crawlers probably can't see most of your website.

When ChatGPT, Claude, or Perplexity crawl the web to build their knowledge, they don't render JavaScript. They can't. They're processing billions of pages and don't have time to spin up a browser for each one.

This is a problem because modern websites are built with JavaScript frameworks. React, Vue, Next.js - they all render content dynamically. When you load nike.com in your browser, JavaScript fetches product data, builds the page, and displays it beautifully.

But when GPTBot visits that same URL? It sees something like this:

HTML
<div id="root"></div>
<script src="/bundle.js"></script>

That's it. No products. No descriptions. No content. Just an empty div waiting for JavaScript that will never run.

Your beautiful, content-rich website looks like a blank page to AI.


What this means for your brand

When someone asks ChatGPT "What are the best running shoes for marathons?", the model draws from its training data. If it never saw your product pages because they were JavaScript-rendered, you don't exist in its knowledge base.

The same pattern applies across AI applications:

  • AI search (Perplexity, ChatGPT Search) - Can't cite content it never crawled
  • AI assistants - Won't recommend products they don't know exist
  • AI agents - Can't complete tasks on sites they can't read
  • AI training - Your brand is excluded from future model knowledge

Every day you're invisible to AI crawlers is a day your competitors' content is being learned while yours isn't.


How AI Pages solves this

AI Pages is an intelligent proxy that sits between your website and AI crawlers. It does one thing extremely well: serve AI-optimized versions of your pages to AI crawlers, while keeping everything unchanged for human visitors.

1. Detect → 2. Transform → 3. Serve → 4. Track

Detect - When a request comes in, AI Pages checks the user-agent. Human browser? Pass it through normally. AI crawler? Route it to our optimization layer.

Transform - For AI crawlers, AI Pages pre-renders your page, strips unnecessary code, adds structured data, and creates a clean, fact-dense version optimized for machine comprehension.

Serve - The optimized content is cached at the edge and served in ~100ms. AI crawlers get exactly what they need to understand and cite your content.

Track - Every crawler visit is logged. You see exactly when GPTBot, ClaudeBot, or PerplexityBot visit, which pages they access, and whether they got optimized content.
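The Detect step can be sketched as a simple user-agent check. This is a minimal illustration, not Trakkr's actual implementation; the crawler names are real user-agent tokens, but the routing logic and function names are assumptions.

```typescript
// Sketch of the Detect step: route a request by its user-agent.
// Substring matching on known crawler tokens is illustrative only.
const AI_CRAWLERS = [
  "GPTBot",
  "ChatGPT-User",
  "ClaudeBot",
  "PerplexityBot",
  "Google-Extended",
];

function isAICrawler(userAgent: string): boolean {
  return AI_CRAWLERS.some((bot) => userAgent.includes(bot));
}

// Human browser → pass through to origin; AI crawler → optimization layer.
function route(userAgent: string): "origin" | "optimizer" {
  return isAICrawler(userAgent) ? "optimizer" : "origin";
}
```

Note that Googlebot (traditional search) is deliberately absent from the list above: only AI-specific crawlers get rerouted, which is what keeps SEO unaffected.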


What makes it different

| Traditional SEO | AI Pages |
| Same content for everyone | Different formats for humans vs AI |
| Hopes crawlers render JavaScript | Guarantees crawlers see content |
| Static optimization | Dynamic, AI-specific optimization |
| No visibility into AI crawlers | Full analytics on crawler activity |

This isn't cloaking. AI Pages serves the same content - just in a format AI can actually read. Like providing a transcript for a podcast. Same information, different delivery.


The five optimization features

AI Pages applies five AI-focused enhancements to every page. You can enable or disable each one individually.

1. Pre-rendering & compression

The foundation. AI Pages renders your JavaScript, captures the full HTML, strips analytics scripts, tracking pixels, and unnecessary code, then compresses everything into a clean, fast-loading version.

Result: AI crawlers see your actual content, not an empty page.
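The cleanup stage of this step can be sketched as a post-render pass over the captured HTML. A real implementation would use a proper HTML parser after headless rendering; the regex-based version below is only illustrative.

```typescript
// Sketch of post-render cleanup: drop script tags and 1x1 tracking pixels
// from rendered HTML, then collapse leftover whitespace.
function stripScripts(html: string): string {
  return html
    .replace(/<script\b[\s\S]*?<\/script>/gi, "") // inline and external scripts
    .replace(/<img[^>]*\b(?:width|height)="1"[^>]*>/gi, "") // tracking pixels
    .replace(/\s{2,}/g, " ") // collapse extra whitespace
    .trim();
}
```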

2. Structured data injection

AI Pages analyzes your page content and automatically generates JSON-LD schema markup. Product pages get Product schema. Blog posts get Article schema. FAQs get FAQPage schema.

Result: AI models can parse your content programmatically and understand context.
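Assuming product fields have already been extracted from the page, injecting a JSON-LD block might look like the sketch below. The field names follow schema.org's Product and Offer types; the interface and function are hypothetical, not Trakkr's API.

```typescript
// Sketch of structured-data injection: build a Product JSON-LD tag
// from extracted page fields (extraction assumed to happen upstream).
interface ProductFacts {
  name: string;
  brand: string;
  price: string;
  currency: string;
}

function productJsonLd(p: ProductFacts): string {
  const schema = {
    "@context": "https://schema.org",
    "@type": "Product",
    name: p.name,
    brand: { "@type": "Brand", name: p.brand },
    offers: {
      "@type": "Offer",
      price: p.price,
      priceCurrency: p.currency,
      availability: "https://schema.org/InStock",
    },
  };
  return `<script type="application/ld+json">${JSON.stringify(schema)}</script>`;
}
```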

3. Key facts extraction

AI Pages identifies specific data points - prices, percentages, dates, statistics - and marks them explicitly. When ChatGPT needs a specific fact, it can find and cite yours.

Result: Your content becomes quotable. AI models can extract precise information.
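A simplified version of fact marking could be pattern-based: find prices, percentages, and years in the text and wrap them explicitly. The `<data>` markup and the patterns are illustrative assumptions, not Trakkr's actual output format.

```typescript
// Sketch of key-fact extraction: wrap prices, percentages, and years
// so a crawler can locate them explicitly.
const FACT_PATTERN = /(\$\d+(?:\.\d{2})?|\d+(?:\.\d+)?%|\b(?:19|20)\d{2}\b)/g;

function markFacts(text: string): string {
  return text.replace(FACT_PATTERN, '<data class="key-fact">$1</data>');
}
```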

4. Automated FAQ generation

AI Pages analyzes your content and generates relevant Q&A pairs grounded in the information your page already provides. AI systems prefer citing content that directly answers user questions.

Result: Your pages match the query format AI users actually ask.

5. Entity recognition

AI Pages identifies and marks every entity you reference - companies, technologies, people, products - helping AI understand the "who" and "what" in your content.

Result: Proper attribution. When AI mentions entities from your content, it can cite you.
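In the simplest case, entity marking can be sketched as a dictionary lookup over the page text. A real system would use named-entity recognition rather than a fixed list, and this sketch ignores overlapping names (e.g. a brand inside a product name); everything here is a hypothetical illustration.

```typescript
// Sketch of entity marking: wrap known entity names so AI can attribute
// the "who" and "what". Overlap handling is deliberately omitted.
function markEntities(text: string, entities: Record<string, string>): string {
  // Replace longer names first so shorter names don't match prematurely.
  const names = Object.keys(entities).sort((a, b) => b.length - a.length);
  let out = text;
  for (const name of names) {
    out = out
      .split(name)
      .join(`<span class="entity" data-type="${entities[name]}">${name}</span>`);
  }
  return out;
}
```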


Before vs After

Here's what a typical product page looks like to an AI crawler, before and after AI Pages:

BEFORE (raw HTML):

HTML
<div class="hero-banner animate-fade">
  <script>trackPageView('running-shoes');</script>
  <div class="grid gap-4 md:grid-cols-2">
    <img src="shoe.jpg" loading="lazy" class="transform hover:scale-105">
    <div class="flex flex-col justify-center">
      <h1 class="text-4xl font-bold">Nike Air Zoom Pegasus 40</h1>
      <div class="price">$130</div>
      <button onclick="addToCart()">Add to Cart</button>
    </div>
  </div>
  <!-- 500 more lines of nested divs, scripts, analytics... -->
</div>

AFTER (AI Pages-optimized):

HTML
<article itemscope itemtype="https://schema.org/Product">
  <h1 itemprop="name">Nike Air Zoom Pegasus 40</h1>
  <meta itemprop="brand" content="Nike">
  <meta itemprop="category" content="Running Shoes">
  
  <p itemprop="description">The Nike Air Zoom Pegasus 40 is a neutral 
  daily running shoe designed for runners seeking responsive cushioning 
  and reliable comfort across all distances.</p>
  
  <div itemprop="offers" itemscope itemtype="https://schema.org/Offer">
    <meta itemprop="price" content="130.00">
    <meta itemprop="priceCurrency" content="USD">
    <meta itemprop="availability" content="https://schema.org/InStock">
  </div>
  
  <section>
    <h2>Key Features</h2>
    <ul>
      <li>Air Zoom unit in the forefoot for responsive cushioning</li>
      <li>10mm heel-to-toe drop for balanced ride</li>
      <li>Engineered mesh upper for breathability</li>
    </ul>
  </section>
  
  <section itemscope itemtype="https://schema.org/FAQPage">
    <h2>Frequently Asked Questions</h2>
    <div itemprop="mainEntity" itemscope itemtype="https://schema.org/Question">
      <h3 itemprop="name">Is the Pegasus 40 good for marathon training?</h3>
      <div itemprop="acceptedAnswer" itemscope itemtype="https://schema.org/Answer">
        <p itemprop="text">Yes, the Pegasus 40 is a versatile daily 
        trainer suitable for marathon training, offering durability and 
        comfort for high mileage.</p>
      </div>
    </div>
  </section>
</article>

The AI crawler now sees: Clean structure, rich schema markup, FAQ content that directly answers common questions, and zero JavaScript bloat. This is what gets cited.


Supported AI crawlers

AI Pages automatically detects and optimizes for all major AI crawlers:

| Provider | Crawlers | Status |
| OpenAI | GPTBot, ChatGPT-User, OAI-SearchBot | Enabled by default |
| Anthropic | ClaudeBot, Claude-User, Claude-SearchBot | Enabled by default |
| Google | Google-Extended, Google-Agent | Enabled by default |
| Perplexity | PerplexityBot | Enabled by default |
| Meta | Meta-ExternalAgent | Enabled by default |
| Cohere | cohere-ai | Enabled by default |
| Apple | Applebot-Extended | Enabled by default |
| Amazon | Amazonbot | Enabled by default |
| ByteDance | ByteSpider | Optional |
| Baidu | Baiduspider | Optional |

You can customize which crawlers receive optimized content in your AI Pages settings.


Requirements

| Requirement | Details |
| Website | Any hosting platform |
| Platform | Cloudflare, Vercel, Netlify, Next.js, CloudFront, WordPress, Node.js, Nginx, or any platform via Cloudflare DNS proxy |
| Trakkr plan | Growth or Scale |

AI Pages works with most hosting platforms out of the box. During setup, you'll choose your platform and get personalized integration code. Setup typically takes about 10 minutes.


Is AI Pages safe?

Yes. AI Pages has been designed with risk mitigation as a priority:

  • No SEO impact - AI Pages never serves different content to traditional SEO crawlers (Googlebot for search). Your rankings are unaffected.
  • Same content - AI Pages reformats your content for AI readability. It doesn't add fake information or change what you're saying.
  • Instantly reversible - Remove the AI Pages integration from your platform and it's completely gone. No lingering effects.
  • No site changes - Your actual website is never modified. AI Pages is a layer in front, not a change to your codebase.
Note
AI Pages is a first-mover technology. While it's been running reliably across many sites, you should understand how it works before enabling it. Read the Technical Details for the full architecture.

What you'll see after setup

Once AI Pages is live, you get visibility into how AI models interact with your content:

  • Crawler visit log - See exactly when GPTBot, ClaudeBot, and others visit
  • Page popularity - Which pages do AI crawlers access most?
  • Cache performance - Are optimized pages being served quickly?
  • Optimization status - Is each visit getting the enhanced content?

This visibility is valuable even without the optimization. Knowing when and how AI crawlers visit helps you understand the AI learning cycle.


Ready to get started?

Installation Guide

Set up AI Pages in under 30 minutes.

Usage & Billing

Understand costs and usage limits.

Technical Details

Deep dive into how AI Pages works.
