Optimize
Audit your website for AI visibility. Get actionable fixes that actually work.
- Understand the difference between SEO and AI optimization
- See exactly what AI crawlers look for when visiting your site
- Get platform-specific fixes for Shopify, WordPress, Next.js, and more
- Track your optimization progress over time
Here's something that surprises most people: a website that ranks #1 on Google can be completely invisible to AI models.
Traditional SEO and AI optimization overlap, but they're not the same game. Google's crawler has been around for decades - it understands JavaScript rendering, handles lazy loading, follows redirects gracefully. AI crawlers? They're newer, simpler, and often give up faster.
Optimize analyzes your site through the lens of what AI systems actually need to understand and cite your content. Then it tells you exactly what to fix, with code snippets for your specific platform.
Why AI optimization is different
When ChatGPT or Perplexity needs to answer a question, they're not ranking pages like Google. They're looking for content they can confidently extract facts from.
That fact-extraction lens is what Optimize focuses on. Your site might be beautiful, fast, and well-linked - but if the AI crawler sees a mess of JavaScript and no schema markup, you're invisible.
Running your first audit
1. Go to Optimize from the sidebar
2. Enter your website URL (or let it auto-fill from your brand settings)
3. Click Run Audit
4. Watch the real-time progress as we crawl your site
The audit typically takes 30-90 seconds depending on your site size. You'll see each phase in real time:
Phase 1: Crawl - We discover pages on your site via your sitemap and internal links. You'll see URLs streaming in as we find them.
Phase 2: Analyze - Each page gets checked for 17+ technical factors: meta tags, heading structure, schema markup, canonical tags, and more.
Phase 3: Deep Dive - Your most important pages (homepage, key product pages) get AI-specific analysis. We check if the content is actually extractable and citable.
Phase 4: Report - We compile everything into prioritized recommendations with platform-specific code.
The Overview tab
When your audit completes, you land on the Overview. This is your command center - everything important at a glance.
Your optimization score
The big number in the ring shows your overall AI-readiness on a scale from 0 to 100:
| Score | What it means |
|---|---|
| 80-100 | Excellent - Your site is AI-ready. Minor tweaks only. |
| 60-79 | Good - Solid foundation, but room to improve. |
| 40-59 | Needs Work - AI crawlers are struggling with your site. |
| 0-39 | Poor - Major issues preventing AI visibility. |
Below the score you'll see how you compare to the industry average. If you're at 72 and the average is 58, you're already ahead. If you're below average, that's your motivation.
Top priority
The biggest box shows your single most important fix - the one issue where a fix delivers the biggest gain in AI visibility. We show:
- What the issue is
- The estimated impact (e.g., "+15-25% citation potential")
- How long it typically takes to fix
- How many pages are affected
Click "View fix details" to get the full breakdown with code.
Issues breakdown
A quick count of issues by severity:
- Critical (red) - These are actively blocking AI crawlers. Fix immediately.
- High (amber) - Significant problems that hurt your visibility. High priority.
- Medium (gray) - Worth fixing, but not urgent.
AI Readiness checks
This is where Optimize gets interesting. We check for five files that AI systems increasingly rely on:
| File | What it is |
|---|---|
| robots.txt | The classic - tells crawlers what they can access. We check that you're not accidentally blocking AI bots. |
| sitemap.xml | Helps AI systems discover all your pages. Essential for larger sites. |
| llms.txt | An emerging standard. Tells AI models about your site structure and key content. Like a README for AI. |
| mcp.json | Model Context Protocol - enables AI agents to interact with your site programmatically. Advanced but increasingly important. |
| security.txt | Shows you're a legitimate, trustworthy site. AI systems check for this. |
Most sites have a robots.txt and a sitemap. Few have llms.txt or mcp.json yet - but those files are becoming the differentiators. We show you exactly how to add them.
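Of the five, security.txt is usually the quickest to add. Per RFC 9116 it lives at /.well-known/security.txt and requires only a contact and an expiry date; the address below is a placeholder:

```
Contact: mailto:security@example.com
Expires: 2026-12-31T23:59:00.000Z
```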
The Pages tab
Click "Pages" to see every URL we analyzed. This is where you drill into specifics.
Filtering and sorting
The filter bar lets you slice the data:
- Page type - Homepage, product pages, blog posts, etc.
- Score range - Show only pages below 50, or only your best performers
- Deep analyzed only - Filter to pages that got the full AI analysis
- Search - Find a specific URL
Sort by score (ascending) to see your worst pages first - that's usually where to start.
What you see for each page
Each row shows:
- The URL path
- Page type (we auto-detect: homepage, product, blog, category, etc.)
- The technical score
- Key stats: word count, response time, HTTP status
Click any row to open the Page Detail Panel.
Page Detail Panel
This slide-out panel shows everything about a single page:
Meta information - Title, description, H1. We flag issues like "title too long" or "no H1 found."
Technical stats - Status code, response time, internal/external link counts.
Issues on this page - Every check that failed, specific to this URL.
Deep Analysis (if available) - This is the AI-specific stuff:
- Query coverage - What questions can this page answer? What's missing?
- Citation potential - Does this page have quotable facts? Or is it too vague?
- Entity clarity - Can AI clearly identify what this page is about?
The Issues tab
This is your fix-it list. Every issue found across your site, grouped and prioritized.
How issues are organized
Issues are grouped by type (e.g., "Missing meta description", "No schema markup") with a count of affected pages. Critical issues appear first, then high, then medium.
Each issue card shows:
- Title - What's wrong
- Severity - Critical, high, or medium
- Affected pages - How many URLs have this problem
- Affected % - What percentage of your crawled pages
Issue details
Click any issue to expand it. You'll see:
Why it matters - How this specific issue affects AI visibility. Not generic advice - we explain the actual mechanism.
How to fix it - Step-by-step instructions. For technical issues, we include code snippets.
Platform-specific fixes - If you're on Shopify, we show Shopify-specific code. WordPress gets WordPress code. Next.js gets Next.js code. We detect your platform automatically.
Affected URLs - The actual pages where this issue exists. Click any URL to jump to its detail panel.
Issue status tracking
You can mark issues as:
- Open - Not yet addressed
- In Progress - You're working on it
- Fixed - You've made the change
- Ignored - Not relevant to your situation
When you re-run an audit, we verify fixed issues. If they come back, we'll let you know.
The History tab
Track your optimization progress over time.
Score trend chart
See your overall score graphed over time. Did that schema markup addition last month actually help? The chart shows you.
Past audits
A list of every audit you've run, showing:
- Date
- Score at that time
- Whether it was a manual or scheduled audit
- Score change from the previous audit
Click any past audit to view it. You're looking at a snapshot of how your site looked at that point.
Compare audits
Select two audits to compare them side-by-side. This shows you:
- Score change
- Issues that were fixed
- New issues that appeared
- Improvement in specific categories
This is particularly useful after a site redesign or major content update.
Weekly scheduled audits
You don't have to remember to audit. Set it and forget it.
In the top-right, click Schedule. Toggle on weekly audits and we'll automatically run a full audit every Monday at 9am UTC.
You'll get:
- Fresh data every week
- Historical trending automatically
- Alerts if your score drops significantly
Think of it like a health check for your AI visibility.
Understanding AI-specific issues
Let's talk about some issues you'll see that are unique to AI optimization.
"AI crawlers may be blocked"
This happens when your robots.txt blocks AI user agents like GPTBot, ClaudeBot, PerplexityBot, or Google-Extended. You might have done this intentionally - some sites prefer to keep their content out of AI training. But if you want AI visibility, you need to allow these crawlers.
The fix: Add explicit allow rules for AI user agents, or remove blocking rules.
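As an illustration, an allow rule set for major AI crawlers might look like the following. Bot tokens change over time, so treat this list as a starting point and check each vendor's documentation:

```
# robots.txt - explicitly allow AI crawlers
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Google-Extended
Allow: /
```

A crawler obeys the most specific user-agent group that matches it, so named groups like these take precedence over a generic `User-agent: *` block - you can allow AI crawlers even while restricting others.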
"No llms.txt file"
This is an emerging standard. The llms.txt file sits at your domain root (example.com/llms.txt) and tells AI models:
- What your site is about
- What your most important pages are
- How to navigate your content
- What topics you're authoritative on
Think of it like a README for AI. It's not required, but it's increasingly checked by AI systems when deciding what to trust.
The fix: We generate an llms.txt file for you based on your site structure. Copy-paste it to your root.
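For reference, the proposed llms.txt format is plain Markdown: an H1 with the site name, a blockquote summary, then sections of annotated links. Everything below is placeholder content:

```
# Acme Outdoor Gear
> Direct-to-consumer retailer of hiking and camping gear, with
> detailed product specs and buying guides.

## Key pages
- [Product catalog](https://example.com/products): full specs and pricing
- [Buying guides](https://example.com/guides): comparisons and sizing advice

## Optional
- [About](https://example.com/about): company background and contact details
```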
"Missing schema markup"
Schema markup (JSON-LD) is structured data that explicitly tells crawlers what your content means. Instead of inferring that "$99" is a price, schema declares: "This is a Product with price 99.00 USD."
AI models rely heavily on schema for accurate information extraction. Without it, they're guessing.
The fix: Add appropriate schema for your page type. We provide the exact JSON-LD code for your platform.
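To make the price example above concrete, here's a minimal schema.org Product snippet in JSON-LD. The product name and values are placeholders; it goes inside a `<script type="application/ld+json">` tag in the page head:

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Trail Runner Pro",
  "description": "Lightweight trail running shoe, 10.2 oz.",
  "offers": {
    "@type": "Offer",
    "price": "99.00",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}
```

With this in place, a crawler doesn't have to infer that "$99" is a price - the type, amount, and currency are declared outright.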
"Content not citable"
This is our AI deep-dive check. Some content is clear and factual: "The Nike Air Zoom Pegasus 40 weighs 10.2oz and retails for $130." Other content is vague: "Experience amazing comfort with our latest innovation."
AI models cite the first. They skip the second.
The fix: Add specific, factual statements. Include numbers, dates, names. Make your content quotable.
Platform-specific notes
Shopify
Most Shopify themes handle SEO basics well, but AI optimization is hit-or-miss. Common issues:
- Theme JSON-LD is often incomplete or wrong
- JavaScript-heavy collection pages
- Missing structured product data
We detect Shopify and give you Liquid code snippets that work.
WordPress
WordPress core is AI-friendly. Plugins are the wildcard. Common issues:
- Plugin conflicts blocking crawlers
- Missing schema (unless you're using Yoast or RankMath)
- Over-reliance on JavaScript widgets
We'll tell you which plugins to check and what to configure.
Next.js / React
JavaScript frameworks are the biggest AI visibility problem. Here's why: AI crawlers often don't execute JavaScript. They see your initial HTML - which for most React apps is an empty div.
If you're on Next.js, you need server-side rendering (SSR) or static generation (SSG). Client-side only? You're invisible.
We detect your rendering mode and flag issues accordingly.
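To sketch the difference (assuming the App Router in Next.js 13 or 14; the fetch URL and field names are placeholders): a server component fetches and renders data on the server, so the product facts land in the initial HTML that non-JavaScript crawlers receive:

```tsx
// app/products/[id]/page.tsx
// Server component (the App Router default): data is fetched and
// rendered on the server, so it appears in the initial HTML.
export default async function ProductPage({
  params,
}: {
  params: { id: string };
}) {
  const res = await fetch(`https://example.com/api/products/${params.id}`);
  const product = await res.json();
  return (
    <main>
      <h1>{product.name}</h1>
      <p>
        {product.weight}, {product.price}
      </p>
    </main>
  );
}
```

A purely client-rendered React app, by contrast, ships an HTML shell with an empty root div - a crawler that skips JavaScript sees none of this content.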
Webflow / Wix / Squarespace
These platforms generally handle basics well because they control the output. Issues tend to be:
- Generic/incomplete schema
- Missing sitemap customization
- Limited control over robots.txt
We give you platform-specific instructions for what you can actually change.
Best practices
Start with your homepage
Your homepage is almost always your most crawled page. Get it right first.
Fix critical issues immediately
Critical issues (red) are usually blocking problems - AI crawlers literally can't access or understand your content. Fix these before anything else.
Prioritize high-traffic pages
A fix on a page that gets 10,000 visits matters more than a fix on a page that gets 10.
Re-audit after changes
Made fixes? Run a new audit within 48 hours to verify they worked. Your score should improve.
Set up weekly audits
Your site changes. Your competitors' sites change. AI systems evolve. Weekly audits catch problems before they hurt you.
What Optimize doesn't do
To set expectations:
- We don't crawl competitors - Optimize analyzes your site only. For competitive analysis, use the main Trakkr visibility tracking.
- We don't make changes for you - We tell you what to fix and give you the code, but you (or your developer) implement it.
- We don't guarantee citations - Optimization improves your chances, but AI systems decide what to cite based on many factors.
- We don't replace SEO - AI optimization and traditional SEO are complementary, not competitive. Good SEO helps. AI optimization adds a layer.
Next steps
AI Pages
Want to go further? AI Pages serves AI-optimized content automatically, without changing your main site.
Content Studio
Creating new content? Our editor helps you write AI-optimized articles from the start.
Back to Dashboard
Check your overall AI visibility scores across all models.