
AI Pages Analytics

Monitor AI crawler visits and understand how AI models access your content.

5 min read · Updated Jan 11, 2026
What you'll learn
  • See exactly when AI crawlers visit your pages
  • Understand which content AI models prioritize
  • Track cache performance and optimization success
  • Connect crawler behavior to visibility improvements

AI Pages doesn't just optimize - it observes. Every AI crawler visit is logged, giving you unprecedented visibility into how AI models interact with your content.


The dashboard overview

When you open AI Pages → Dashboard, you'll see three key areas:

Requests served

A large counter showing:

  • This month's requests vs your limit
  • Progress bar showing usage percentage
  • Percentage used badge

Hourly requests chart

A 24-hour sparkline showing request volume. Use this to spot:

  • Peak crawler activity times
  • Unusual spikes or drops
  • Time zone patterns (many crawlers run on US schedules)

Response time chart

Average response time over 24 hours. Healthy AI Pages should show:

  • Cache hits: ~70-150ms
  • Cache misses: ~500-2000ms (first-time optimization)

Crawler breakdown

See which AI models visit your site most:

| Crawler | Company | What it powers |
| --- | --- | --- |
| GPTBot | OpenAI | ChatGPT, GPT-4 training |
| ChatGPT-User | OpenAI | ChatGPT browsing feature |
| ClaudeBot | Anthropic | Claude training data |
| PerplexityBot | Perplexity | Perplexity search |
| Google-Extended | Google | Gemini, AI Overviews |
| Google-Agent | Google | AI Agent actions |

Why this matters: If GPTBot visits frequently, your content is likely being ingested into OpenAI's systems. Track which models crawl you most to understand where you'll see visibility gains.
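If you want to run the same breakdown against your own raw server logs, the crawlers in the table above can be identified by their user-agent strings. A minimal sketch (the substring map mirrors the table and is not exhaustive; real user agents include extra tokens around the crawler name):

```python
# Map AI-crawler user-agent substrings to the operating company.
# Mirrors the crawler breakdown table; not an exhaustive list.
AI_CRAWLERS = {
    "GPTBot": "OpenAI",
    "ChatGPT-User": "OpenAI",
    "ClaudeBot": "Anthropic",
    "PerplexityBot": "Perplexity",
    "Google-Extended": "Google",
    "Google-Agent": "Google",
}

def classify_crawler(user_agent: str):
    """Return (crawler, company) if the user agent matches a known AI crawler."""
    ua = user_agent.lower()
    for crawler, company in AI_CRAWLERS.items():
        if crawler.lower() in ua:
            return crawler, company
    return None  # not a known AI crawler

classify_crawler("Mozilla/5.0 (compatible; GPTBot/1.0)")
```

Grouping your access log by the returned crawler name reproduces the dashboard's crawler breakdown.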


The crawl log

Every visit is recorded with details:

| Field | Description |
| --- | --- |
| Time | Exact timestamp (e.g., "2 minutes ago") |
| Crawler | Which AI crawler (GPTBot, ClaudeBot, etc.) |
| URL | The page that was accessed |
| Duration | Response time in milliseconds |
| Status | Cache Created, Cache Served, or Error |

Status meanings

  • Cache Served ✓ - Optimized content served from cache (fast, ideal)
  • Cache Created - First visit, page was optimized and cached
  • Error - Something went wrong (check the logs)
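If you export the crawl log (see "Exporting data" below), tallying these statuses takes a few lines. A sketch assuming a CSV export; the column names (`crawler`, `url`, `duration`, `status`) are illustrative, so match them to the headers in your actual file:

```python
import csv
import io

def status_counts(csv_text: str) -> dict:
    """Count Status values in a crawl-log CSV export.

    Assumes a "status" column; adjust to your export's real headers.
    """
    counts = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        counts[row["status"]] = counts.get(row["status"], 0) + 1
    return counts

sample = """crawler,url,duration,status
GPTBot,/blog/post,85,Cache Served
ClaudeBot,/products,1200,Cache Created
GPTBot,/about,90,Cache Served
"""
print(status_counts(sample))  # {'Cache Served': 2, 'Cache Created': 1}
```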

Filtering the log

Use filters to find what you need:

  • By crawler - See only GPTBot visits, for example
  • By URL - See all visits to a specific page
  • By date - Focus on a specific time range
  • By status - Find errors or cache misses

Top pages

See which pages crawlers visit most:

| Rank | Page | Visits | Insight |
| --- | --- | --- | --- |
| 1 | /blog/ai-visibility-guide | 147 | High-value content |
| 2 | /products | 89 | Category pages popular |
| 3 | /about | 67 | Crawlers checking company info |
| 4 | / | 52 | Homepage always crawled |

How to use this:

  • Pages with high visit counts are being indexed frequently - make sure they're optimized
  • Pages with zero visits may have crawl issues - check your robots.txt
  • New pages appearing in the list mean crawlers are discovering your content
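Finding zero-visit pages is a simple set difference between the URLs you publish (e.g., from your sitemap) and the URLs that appear in the crawl log. A minimal sketch, assuming you have both URL lists in hand:

```python
def uncrawled_pages(sitemap_urls, crawled_urls):
    """Pages you publish that no AI crawler has visited yet.

    sitemap_urls: every path you expect to be crawled (e.g., from sitemap.xml)
    crawled_urls: distinct URL values from the crawl log
    """
    return sorted(set(sitemap_urls) - set(crawled_urls))

uncrawled_pages(
    ["/", "/about", "/pricing", "/blog/ai-visibility-guide"],
    ["/", "/about", "/blog/ai-visibility-guide"],
)  # ["/pricing"]
```

Any page in the result is a candidate for a robots.txt or internal-linking check.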

Cache performance

Monitor how well caching is working:

Cache hit rate

Percentage of requests served from cache.

  • 90%+ - Excellent (most requests are fast)
  • 70-90% - Good (healthy cache performance)
  • <70% - Investigate (too many cache misses)

Low hit rate could mean:

  • Many new pages being crawled
  • Cache expiring too often
  • High crawler diversity (each model has separate caches)
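Using the status values from the crawl log, the hit rate is just Cache Served divided by all cacheable requests. A sketch that ignores Error entries, on the assumption that errors are neither hits nor misses:

```python
def cache_hit_rate(statuses) -> float:
    """Hit rate = Cache Served / (Cache Served + Cache Created).

    statuses: iterable of Status strings from the crawl log.
    Error entries are excluded from the denominator.
    """
    served = sum(1 for s in statuses if s == "Cache Served")
    created = sum(1 for s in statuses if s == "Cache Created")
    total = served + created
    return served / total if total else 0.0

cache_hit_rate(["Cache Served"] * 9 + ["Cache Created"])  # 0.9 -> "Good" band
```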

Response times

| Type | Expected time |
| --- | --- |
| Cache hit | 70-150ms |
| Cache miss | 500-2000ms |
| Error | N/A |

If response times are high, check:

  • Your origin server performance
  • Your integration's edge/server location
  • Page complexity

Correlating with visibility

The key question: Is AI Pages improving your AI visibility?

What to look for

  1. Crawler visits → visibility changes - After GPTBot visits increase, do your ChatGPT visibility scores improve?
  2. Page-level correlation - Pages that get crawled frequently should show up more in AI responses.
  3. Timeline - There's a delay between crawling and visibility:
     • Perplexity: hours to days (real-time search)
     • ChatGPT: days to weeks (model updates)
     • Claude: weeks to months (training cycles)

Note
Be patient. AI model updates aren't instant. Track trends over weeks and months, not days. A single crawl doesn't guarantee immediate visibility improvement.

Setting up alerts

Get notified about important crawler activity:

  1. Go to Settings → AI Pages → Alerts
  2. Enable the alerts you want:

| Alert | What it does |
| --- | --- |
| Crawler spike | Notifies when visits increase 50%+ |
| Error rate | Warns if errors exceed threshold |
| New crawler | Alerts when an unknown crawler visits |
| Usage threshold | Warns when approaching limits |

  3. Choose delivery: Email, Slack, or webhook
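If you choose webhook delivery, your endpoint will receive a JSON payload per alert. The exact schema isn't documented here, so the field names below (`alert`, `crawler`, `change`) are hypothetical placeholders - inspect a real payload before relying on them:

```python
import json

def handle_alert(body: str) -> str:
    """Handle one webhook alert payload.

    Field names ("alert", "crawler", "change") are assumptions for
    illustration; check the actual payload your webhook receives.
    """
    payload = json.loads(body)
    if payload.get("alert") == "crawler_spike":
        return f"Spike: {payload['crawler']} visits up {payload['change']}%"
    return "ignored"

handle_alert('{"alert": "crawler_spike", "crawler": "GPTBot", "change": 75}')
```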

Exporting data

Download your analytics for reporting or analysis:

  • CSV export - Full crawl log with all details
  • API access - Query data programmatically
  • Scheduled reports - Weekly email summaries

Go to AI Pages → Analytics → Export to download.
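For the API-access option, a query typically comes down to building a URL with filter parameters. The endpoint path and parameter names below are hypothetical - consult the actual AI Pages API reference for the real ones:

```python
from urllib.parse import urlencode

def export_url(base: str, crawler=None, start=None, end=None) -> str:
    """Build an analytics-export query URL.

    The "/ai-pages/analytics/export" path and the "crawler"/"start"/"end"
    parameter names are illustrative assumptions, not a documented API.
    """
    params = {k: v for k, v in
              {"crawler": crawler, "start": start, "end": end}.items()
              if v is not None}
    return f"{base}/ai-pages/analytics/export?{urlencode(params)}"

export_url("https://api.example.com", crawler="GPTBot", start="2026-01-01")
```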


Best practices

Check weekly. Review crawler activity and cache performance regularly.

Watch for patterns. If certain pages get crawled repeatedly, that signals AI interest. Make those pages stellar.

Investigate errors. Even a small error rate can mean pages aren't being optimized. Fix root causes.

Compare to competitors. If your competitor's content shows up in AI and yours doesn't, check if they're getting more crawler visits.


Next steps

Technical Details

Understand exactly how AI Pages optimizes pages.

Troubleshooting

Fix common AI Pages issues.
