What it is
Crawl Analytics shows how search and AI crawlers interact with your site over the last 30 days.
Instead of guessing which pages or bots matter, you get concrete crawl activity, provider distribution, and per-path issue signals to guide SEO work.
How it works
LovableHTML records crawler requests
Crawler visits and served responses are aggregated for your connected domain.
Metrics are summarized into decision-friendly cards
You get top crawler providers, crawl volume, and crawl efficiency indicators.
Path-level crawl stats show where problems exist
Per-path activity includes issue badges so you can target follow-up actions.
You take action, then measure trend movement
Fix, re-render, and track whether crawl and issue metrics improve over time.
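The aggregation step can be sketched roughly like this. This is a minimal illustration, not LovableHTML's actual pipeline; the log fields (`user_agent`, `path`, `status`) are assumed names.

```python
from collections import Counter

# Illustrative log entries; field names are assumptions, not LovableHTML's schema.
requests = [
    {"user_agent": "Googlebot", "path": "/pricing", "status": 200},
    {"user_agent": "GPTBot", "path": "/pricing", "status": 200},
    {"user_agent": "Googlebot", "path": "/blog/launch", "status": 404},
    {"user_agent": "Bingbot", "path": "/pricing", "status": 200},
]

# Aggregate crawl volume per provider and per path.
by_provider = Counter(r["user_agent"] for r in requests)
by_path = Counter(r["path"] for r in requests)

# Flag paths that returned error responses as issue candidates.
issue_paths = {r["path"] for r in requests if r["status"] >= 400}

print(by_provider.most_common(1))  # top crawler provider
print(by_path.most_common(2))      # busiest paths
print(issue_paths)                 # paths needing follow-up
```

The same counts feed both the summary cards and the per-path issue badges, so one aggregation pass supports the whole dashboard.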
Key metrics and what to do with each
| Metric | What it tells you |
|---|---|
| Top Search Crawler | Which search bot is crawling most often |
| Top AI Crawler | Which AI crawler is most active |
| New Pages Crawled | Newly discovered URLs being visited |
| Total Crawls Served | Overall crawler demand for your content |
| Est. Crawl Budget Saved | Efficiency gained from fast, cache-friendly responses |
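The metric cards above can be derived from aggregated crawl data roughly as follows. The crawler classification lists and the cache-savings formula are illustrative assumptions, not the product's actual definitions.

```python
# Assumed classification of user agents; real products keep a curated list.
SEARCH_BOTS = {"Googlebot", "Bingbot"}
AI_BOTS = {"GPTBot", "ClaudeBot", "PerplexityBot"}

crawls = [
    {"bot": "Googlebot", "cached": True},
    {"bot": "GPTBot", "cached": True},
    {"bot": "Googlebot", "cached": False},
    {"bot": "ClaudeBot", "cached": True},
]

def top_bot(group):
    """Return the most active crawler within a given group, or None."""
    counts = {}
    for c in crawls:
        if c["bot"] in group:
            counts[c["bot"]] = counts.get(c["bot"], 0) + 1
    return max(counts, key=counts.get) if counts else None

total_crawls = len(crawls)
# Illustrative efficiency metric: share of crawls served from cache.
budget_saved = sum(c["cached"] for c in crawls) / total_crawls

print(top_bot(SEARCH_BOTS))  # "Googlebot"
print(top_bot(AI_BOTS))
print(total_crawls, f"{budget_saved:.0%}")
```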
Reading Crawl Stats without noise
- Address high-request paths first.
- Treat repeated crawler interest as a signal of page importance.
- Ignore obvious scanner/spam paths when they are unrelated to real content.
- Focus on issue badges attached to real business pages.
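Separating real pages from scanner noise can be as simple as a pattern filter. The patterns below are common probe examples, not an exhaustive or official list; they are safe to ignore only if you do not actually run those stacks.

```python
import re

# Typical scanner/spam probes (WordPress, PHP, env-file, admin probes).
NOISE_PATTERNS = [r"^/wp-", r"\.php$", r"^/\.env", r"^/admin"]

def is_noise(path: str) -> bool:
    """Return True if a path matches a known scanner/spam pattern."""
    return any(re.search(p, path) for p in NOISE_PATTERNS)

paths = ["/pricing", "/wp-login.php", "/.env", "/blog/launch"]
real = [p for p in paths if not is_noise(p)]
print(real)  # ['/pricing', '/blog/launch']
```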
How to use it with SEO Spider
Use both features as a loop:
Find important paths in Crawl Analytics
Start from pages with high request volume or repeated bot interest.
Open SEO Spider and filter issue codes
Check which high-impact issues are affecting those same paths.
Fix and publish
Apply fixes in your stack, then publish to production.
Re-render and monitor movement over 24 to 72 hours
Refresh affected pages and watch for better crawl consistency and a reduction in issues.
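Measuring that movement amounts to comparing before-and-after snapshots per path. The snapshot fields here are hypothetical, chosen only to show the comparison:

```python
# Hypothetical per-path snapshots taken before a fix and ~72 hours after.
before = {"/pricing": {"crawls": 40, "issues": 3}}
after = {"/pricing": {"crawls": 55, "issues": 1}}

for path, prev in before.items():
    cur = after[path]
    crawl_delta = cur["crawls"] - prev["crawls"]
    issue_delta = cur["issues"] - prev["issues"]
    # Rising crawls plus falling issues indicates the fix is working.
    print(path, f"crawls {crawl_delta:+d}", f"issues {issue_delta:+d}")
```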
Copy prompt tip in context
When an issue set needs AI-assisted remediation, use this sequence:
1. Paste the prompt into your AI website builder.
2. Publish the changes on the correct domain.
3. Re-render the affected pages or the entire site.
4. Review Crawl Analytics and SEO Spider again to validate the improvement.
