About CrawlProof
CrawlProof exists because the SEO toolchain is optimized for blue links, while the AEO (answer engine optimization) toolchain barely exists. We answer one question: can an LLM actually understand and cite your site?
How it works
We fetch your site twice: once as plain HTML, and once in a fully rendered browser. Then we compare what an AI crawler can read against what a user sees. We check robots.txt, sitemap.xml, llms.txt, skill.md, structured data, and the crawl rules your site exposes for GPTBot, ClaudeBot, PerplexityBot, and other AI user agents. Then we score the results.
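The per-bot rules check can be sketched in a few lines. This is a minimal illustration using Python's standard urllib.robotparser, not CrawlProof's actual implementation; the robots.txt content and the example.com URL below are hypothetical.

```python
from urllib import robotparser

# Hypothetical robots.txt that restricts two AI crawlers.
ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /private/

User-agent: ClaudeBot
Disallow: /
"""

AI_BOTS = ["GPTBot", "ClaudeBot", "PerplexityBot"]

def crawl_permissions(robots_txt: str, url: str) -> dict:
    """Return {bot_name: may_fetch} for each AI crawler against one URL."""
    parser = robotparser.RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return {bot: parser.can_fetch(bot, url) for bot in AI_BOTS}

perms = crawl_permissions(ROBOTS_TXT, "https://example.com/private/page")
# GPTBot is blocked from /private/, ClaudeBot from everything,
# and PerplexityBot has no rule, so it defaults to allowed.
```

With no `User-agent: *` fallback group, robotparser treats unlisted bots as allowed, which is why an audit has to check each AI user agent by name rather than the wildcard rules alone.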
Privacy
Audits run under the CrawlProofBot/1.0 user agent. We never log in, submit forms, or send POST requests. You can choose to discard raw HTML after a run; only structured findings persist.