AI Pages Analytics
Monitor AI crawler visits and understand how AI models access your content.
- See exactly when AI crawlers visit your pages
- Understand which content AI models prioritize
- Track cache performance and optimization success
- Connect crawler behavior to visibility improvements
AI Pages doesn't just optimize - it observes. Every AI crawler visit is logged, giving you unprecedented visibility into how AI models interact with your content.
The dashboard overview
When you open AI Pages → Dashboard, you'll see three key areas:
Requests served
A large counter showing:
- This month's requests vs your limit
- Progress bar showing usage percentage
- Percentage used badge
Hourly requests chart
A 24-hour sparkline showing request volume. Use this to spot:
- Peak crawler activity times
- Unusual spikes or drops
- Time zone patterns (many crawlers run on US schedules)
Response time chart
Average response time over 24 hours. Healthy AI Pages should show:
- Cache hits: ~70-150ms
- Cache misses: ~500-2000ms (first optimization)
Crawler breakdown
See which AI models visit your site most:
| Crawler | Company | What it powers |
|---|---|---|
| GPTBot | OpenAI | ChatGPT, GPT-4 training |
| ChatGPT-User | OpenAI | ChatGPT browsing feature |
| ClaudeBot | Anthropic | Claude training data |
| PerplexityBot | Perplexity | Perplexity search |
| Google-Extended | Google | Gemini, AI Overviews |
| Google-Agent | Google | AI agent actions |
Why this matters: If GPTBot visits frequently, your content is likely being ingested into OpenAI's systems. Track which models crawl you most to understand where you'll see visibility gains.
The crawl log
Every visit is recorded with details:
| Field | Description |
|---|---|
| Time | Exact timestamp (e.g., "2 minutes ago") |
| Crawler | Which AI crawler (GPTBot, ClaudeBot, etc.) |
| URL | The page that was accessed |
| Duration | Response time in milliseconds |
| Status | Cache Created, Cache Served, or Error |
Status meanings
- Cache Served ✓ - Optimized content served from cache (fast, ideal)
- Cache Created - First visit, page was optimized and cached
- Error - Something went wrong (check the logs)
Filtering the log
Use filters to find what you need:
- By crawler - See only GPTBot visits, for example
- By URL - See all visits to a specific page
- By date - Focus on a specific time range
- By status - Find errors or cache misses
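If you export the crawl log (see Exporting data below), the same filters are easy to reproduce offline. A minimal sketch, assuming hypothetical field names matching the log columns above:

```python
from datetime import datetime

# Hypothetical log entries with the fields listed above
# (Time, Crawler, URL, Duration, Status).
log = [
    {"time": datetime(2024, 5, 1, 9, 30), "crawler": "GPTBot",
     "url": "/blog/ai-visibility-guide", "duration_ms": 84, "status": "Cache Served"},
    {"time": datetime(2024, 5, 1, 10, 5), "crawler": "ClaudeBot",
     "url": "/products", "duration_ms": 1240, "status": "Cache Created"},
    {"time": datetime(2024, 5, 2, 3, 15), "crawler": "GPTBot",
     "url": "/about", "duration_ms": 0, "status": "Error"},
]

def filter_log(entries, crawler=None, status=None, since=None):
    """Apply the same filters as the dashboard: crawler, status, date."""
    result = entries
    if crawler:
        result = [e for e in result if e["crawler"] == crawler]
    if status:
        result = [e for e in result if e["status"] == status]
    if since:
        result = [e for e in result if e["time"] >= since]
    return result

errors = filter_log(log, status="Error")
gptbot_visits = filter_log(log, crawler="GPTBot")
```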
Top pages
See which pages crawlers visit most:
| Rank | Page | Visits | Insight |
|---|---|---|---|
| 1 | /blog/ai-visibility-guide | 147 | High-value content |
| 2 | /products | 89 | Category pages popular |
| 3 | /about | 67 | Crawlers checking company info |
| 4 | / | 52 | Homepage always crawled |
How to use this:
- Pages with high visits are being indexed frequently - make sure they're optimized
- Pages with zero visits might have crawl issues - check robots.txt
- New pages appearing = crawlers discovering your content
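A top-pages ranking like the table above is just a visit count per URL. As a sketch over a hypothetical list of visited URLs:

```python
from collections import Counter

# Hypothetical crawl log reduced to one URL per visit,
# using the example counts from the table above.
visited_urls = (
    ["/blog/ai-visibility-guide"] * 147
    + ["/products"] * 89
    + ["/about"] * 67
    + ["/"] * 52
)

# Rank pages by visit count, most-visited first.
top_pages = Counter(visited_urls).most_common(4)
for rank, (url, visits) in enumerate(top_pages, start=1):
    print(rank, url, visits)
```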
Cache performance
Monitor how well caching is working:
Cache hit rate
Percentage of requests served from cache.
- 90%+ - Excellent (most requests are fast)
- 70-90% - Good (healthy cache performance)
- <70% - Investigate (too many cache misses)
Low hit rate could mean:
- Many new pages being crawled
- Cache expiring too often
- High crawler diversity (each model has separate caches)
Response times
| Type | Expected Time |
|---|---|
| Cache hit | 70-150ms |
| Cache miss | 500-2000ms |
| Error | N/A |
If response times are high, check:
- Your origin server performance
- Your integration's edge/server location
- Page complexity
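To spot slow responses, compare the average duration per status against the expected ranges in the table. A sketch over hypothetical samples:

```python
from statistics import mean

# Hypothetical (status, duration_ms) samples from the crawl log.
samples = [
    ("Cache Served", 82), ("Cache Served", 110), ("Cache Served", 95),
    ("Cache Created", 720), ("Cache Created", 1650),
]

# Expected ranges from the table above.
EXPECTED_MS = {"Cache Served": (70, 150), "Cache Created": (500, 2000)}

for status, (lo, hi) in EXPECTED_MS.items():
    avg = mean(d for s, d in samples if s == status)
    flag = "OK" if lo <= avg <= hi else "check origin/edge performance"
    print(f"{status}: avg {avg:.0f}ms ({flag})")
```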
Correlating with visibility
The key question: Is AI Pages improving your AI visibility?
What to look for
1. Crawler visits → visibility changes - After GPTBot visits increase, do your ChatGPT visibility scores improve?
2. Page-level correlation - Pages that get crawled frequently should show up more in AI responses.
3. Timeline - There's a delay between crawling and visibility:
   - Perplexity: Hours to days (real-time search)
   - ChatGPT: Days to weeks (model updates)
   - Claude: Weeks to months (training cycles)
Setting up alerts
Get notified about important crawler activity:
1. Go to Settings → AI Pages → Alerts
2. Enable the alerts you want:
| Alert | What it does |
|---|---|
| Crawler spike | Notifies when visits increase 50%+ |
| Error rate | Warns if errors exceed threshold |
| New crawler | Alerts when an unknown crawler visits |
| Usage threshold | Warns when approaching limits |
3. Choose delivery: Email, Slack, or webhook
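The crawler-spike alert fires on a 50%+ increase in visits. The check itself is simple; as a sketch of the condition (the function name and period handling are illustrative, not the product's internals):

```python
def crawler_spike(prev_visits: int, curr_visits: int, threshold: float = 0.5) -> bool:
    """True when visits grew by at least `threshold` (50% by default)
    versus the previous period, matching the 'Crawler spike' alert above."""
    if prev_visits == 0:
        # Any visits after a silent period count as a spike.
        return curr_visits > 0
    return (curr_visits - prev_visits) / prev_visits >= threshold

print(crawler_spike(100, 160))  # True: +60%
print(crawler_spike(100, 130))  # False: +30%
```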
Exporting data
Download your analytics for reporting or analysis:
- CSV export - Full crawl log with all details
- API access - Query data programmatically
- Scheduled reports - Weekly email summaries
Go to AI Pages → Analytics → Export to download.
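The CSV export can be analyzed with any standard tooling. A minimal sketch using Python's `csv` module; the column names here are assumed from the crawl-log fields and may differ in the actual export:

```python
import csv
import io

# Hypothetical export; real column names may differ.
raw = """time,crawler,url,duration_ms,status
2024-05-01T09:30:00Z,GPTBot,/blog/ai-visibility-guide,84,Cache Served
2024-05-01T10:05:00Z,ClaudeBot,/products,1240,Cache Created
"""

rows = list(csv.DictReader(io.StringIO(raw)))
errors = [r for r in rows if r["status"] == "Error"]
print(len(rows), len(errors))  # 2 0
```

In a real workflow you would open the downloaded file instead of an in-memory string.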
Best practices
Check weekly. Review crawler activity and cache performance regularly.
Watch for patterns. If certain pages get crawled repeatedly, that signals AI interest. Make those pages stellar.
Investigate errors. Even a small error rate can mean pages aren't being optimized. Fix root causes.
Compare to competitors. If your competitor's content shows up in AI and yours doesn't, check if they're getting more crawler visits.
Next steps
Technical Details
Understand exactly how AI Pages optimizes pages.
Troubleshooting
Fix common AI Pages issues.