CrawlProof

AEO Audit for kootto.com

Target: https://kootto.com/
Score: 64 / 100
Generated: 2026-05-14T13:56:23.386Z
Pages crawled: 9
Findings: 40 pass · 24 warn · 0 fail · 0 unknown


1. Crawl Summary

2. Data Found

| Data point | Found? | Source | Notes |
| --- | --- | --- | --- |
| Pricing | No | | |
| Customer logos | Yes | Homepage | "trusted by" |
| Social proof | Yes | Homepage | "trusted by" |
| Recent launches | Yes | Press/news pages | https://kootto.com/en-AU/blog |
| Blog post activity | Yes | Blog | https://kootto.com/en-AU/blog |
| New hires | No | | Often only on a /blog/team or LinkedIn page |
| Headline copy | Yes | Homepage | "Restaurant Quality Dining, In Your Home" |
| Positioning | No | | |
| Executive team | No | | |
| Product/service descriptions | Yes | Homepage | From meta description |
| Case studies or testimonials | No | | |
| Contact/demo/signup paths | Yes | Footer links | |

3. Homepage Audit

  • Homepage fetched successfully: HTTP 200 · 84980 bytes · 202ms
  • Page load time: 0.20s. Fast, well within AI crawler budgets.
  • <html lang> declared
  • Single H1: "Restaurant Quality Dining, In Your Home"
  • <title> present (55 chars)
  • Meta description present (150 chars)
  • Canonical present: https://kootto.com/
  • Open Graph tags complete
  • Twitter Card tags complete
  • Critical content is server-rendered: raw and rendered text are within 20% of each other.
  • Alt text coverage: 100% (3/3 images have alt text)
  • Content volume: 878 words. Substantive content; AI models have enough to summarize and recommend.
  • Heading structure: 11 headings (h1: 1, h2: 6, h3: 4). Multiple headings help AI chunk and outline your page.
  • Internal links: 25 (plus 3 external). Links help crawlers navigate.

4. Schema / Structured Data Audit

  • ⚠️ WebSite missing. Adding WebSite JSON-LD helps LLMs identify your entity.
  • ⚠️ SoftwareApplication missing. Adding SoftwareApplication JSON-LD helps LLMs identify your entity.
  • 4 JSON-LD block(s) found. Types: FAQPage, Service, Organization, VideoObject
  • Organization JSON-LD present
  • FAQPage JSON-LD present
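
The missing WebSite block flagged above could sit site-wide in the page <head>. A minimal sketch, assuming "Kootto" as the site name; the SearchAction and its /search endpoint are hypothetical and should only be included if the site actually exposes a search URL:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "WebSite",
  "name": "Kootto",
  "url": "https://kootto.com/",
  "potentialAction": {
    "@type": "SearchAction",
    "target": "https://kootto.com/search?q={search_term_string}",
    "query-input": "required name=search_term_string"
  }
}
</script>
```

The url value should match the canonical (https://kootto.com/) so engines can connect the WebSite node to the Organization markup already published.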

5. robots.txt and sitemap.xml Audit

  • robots.txt present 457 chars
  • robots.txt references sitemap(s)
  • sitemap.xml present (29 URLs)

6. LLM / AI Crawler Accessibility

  • ⚠️ Google-Extended not explicitly addressed. No User-agent: Google-Extended block in robots.txt; we recommend explicit Allow rules so crawlers don't fall back to defaults.
  • ⚠️ Applebot-Extended not explicitly addressed. No User-agent: Applebot-Extended block in robots.txt; we recommend explicit Allow rules so crawlers don't fall back to defaults.
  • ⚠️ CCBot not explicitly addressed. No User-agent: CCBot block in robots.txt; we recommend explicit Allow rules so crawlers don't fall back to defaults.
  • GPTBot has explicit rules. Make sure the block allows the paths you want indexed.
  • ClaudeBot has explicit rules. Make sure the block allows the paths you want indexed.
  • PerplexityBot has explicit rules. Make sure the block allows the paths you want indexed.
  • OAI-SearchBot has explicit rules. Make sure the block allows the paths you want indexed.
  • llms.txt present (6823 chars)
  • skill.md present
  • /.well-known/ai-plugin.json present
  • /.well-known/security.txt present. Security contact published; builds trust with crawlers and security researchers.
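
The three ⚠️ items above can be cleared with explicit opt-in groups in robots.txt. A minimal sketch, assuming you want these crawlers to read the whole site (narrow the Allow paths if not):

```text
# Explicit opt-in for AI crawlers that currently fall back to defaults
User-agent: Google-Extended
Allow: /

User-agent: Applebot-Extended
Allow: /

User-agent: CCBot
Allow: /
```

These groups can be appended below the existing GPTBot / ClaudeBot / PerplexityBot / OAI-SearchBot blocks; a crawler obeys the most specific User-agent group that matches it, which is why an explicit per-agent Allow is safer than relying on the wildcard group.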

7. Positioning Clarity

  • ⚠️ No pricing/plans link found. AI summaries commonly include pricing; add a /pricing page even if pricing is custom.
  • About/Team path discoverable
  • H1 communicates value: "Restaurant Quality Dining, In Your Home"
  • Value-prop language detected
  • Contact / signup path discoverable

8. Missing or Hard-to-Find Information

  • ⚠️ 5 data point(s) could not be found from public pages: Pricing · New hires · Positioning · Executive team · Case studies or testimonials
  • ⚠️ Add a /pricing page. Even contact-us pricing benefits from a /pricing page that LLMs can link to in answers.
  • ⚠️ Allow Google-Extended in robots.txt. Add an explicit "User-agent: Google-Extended" block with "Allow: /" so this AI crawler can read your site.
  • ⚠️ Allow Applebot-Extended in robots.txt. Add an explicit "User-agent: Applebot-Extended" block with "Allow: /" so this AI crawler can read your site.
  • ⚠️ Allow CCBot in robots.txt. Add an explicit "User-agent: CCBot" block with "Allow: /" so this AI crawler can read your site.
  • ⚠️ Add WebSite JSON-LD. Helps engines understand the root site and enables sitelinks-search-box features.
  • ⚠️ Add Product / SoftwareApplication JSON-LD. On /pricing and feature pages, include offers, name, and applicationCategory.

9. Priority To-Do List

  • P2: Add a /pricing page. Even contact-us pricing benefits from a /pricing page that LLMs can link to in answers.
  • P3: Allow Google-Extended in robots.txt. Add an explicit "User-agent: Google-Extended" block with "Allow: /" so this AI crawler can read your site.
  • P3: Allow Applebot-Extended in robots.txt. Add an explicit "User-agent: Applebot-Extended" block with "Allow: /" so this AI crawler can read your site.
  • P3: Allow CCBot in robots.txt. Add an explicit "User-agent: CCBot" block with "Allow: /" so this AI crawler can read your site.
  • P3: Add WebSite JSON-LD. Helps engines understand the root site and enables sitelinks-search-box features.
  • P3: Add Product / SoftwareApplication JSON-LD. On /pricing and feature pages, include offers, name, and applicationCategory.
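
For the last item, a sketch of what the recommended JSON-LD could look like on a future /pricing page. Every value below (name, applicationCategory, price, currency, URL) is a placeholder, not data found on the site; the /pricing URL does not exist yet and is the page this report recommends creating:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "SoftwareApplication",
  "name": "Kootto",
  "applicationCategory": "LifestyleApplication",
  "operatingSystem": "Web",
  "offers": {
    "@type": "Offer",
    "price": "0.00",
    "priceCurrency": "AUD",
    "url": "https://kootto.com/pricing"
  }
}
</script>
```

If the business is a service marketplace rather than software, a Service or Product type with the same offers structure may describe it more accurately; the checklist above names SoftwareApplication, so that type is shown here.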

Report by CrawlProof. Reusable after every major website change.