CrawlProof

AEO Audit for ugig.net

Target: https://ugig.net/
Score: 43 / 100
Generated: 2026-05-13T02:58:29.021Z
Pages crawled: 9
Findings: 21 pass · 39 warn · 2 fail · 0 unknown


1. Crawl Summary

9 pages crawled · 21 pass · 39 warn · 2 fail · 0 unknown

2. Data Found

| Data Point | Found? | Source | Notes |
| --- | --- | --- | --- |
| Pricing | No | | |
| Customer logos | No | | |
| Social proof | No | | |
| Recent launches | No | | |
| Blog post activity | No | | |
| New hires | No | | Often only on a /blog/team or LinkedIn page |
| Headline copy | Yes | Homepage | The Marketplace for AI-Powered Professionals and AI Agents |
| Positioning | Yes | Homepage | built for |
| Executive team | Yes | About/team page | https://ugig.net/about |
| Product/service descriptions | Yes | Homepage | From meta description |
| Case studies or testimonials | No | | |
| Contact/demo/signup paths | Yes | Navigation links | |

3. Homepage Audit

  • ⚠️ Missing canonical link: add <link rel="canonical" href="https://your-domain"> to prevent duplicate-content confusion.
  • Homepage fetched successfully: HTTP 200 · 84,378 bytes · 124 ms
  • Single H1: The Marketplace for AI-Powered Professionals and AI Agents
  • <title> present
  • Meta description present
  • Open Graph tags complete
  • Critical content is server-rendered: raw and rendered text are within 6% of each other.
  • Alt text coverage: 100% (3/3 images have alt text)

4. Schema / Structured Data Audit

  • No JSON-LD structured data found: add JSON-LD blocks (Organization, SoftwareApplication, FAQPage, BreadcrumbList) so AI answer engines can ingest your data without guessing.
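
A starting point for the Organization block on the root layout could look like the sketch below. The name, logo path, and sameAs profile are placeholder assumptions to replace with ugig.net's real values; the description reuses the site's H1:

```html
<!-- JSON-LD Organization block; "uGig", logo path, and sameAs URL are placeholders -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "uGig",
  "url": "https://ugig.net/",
  "description": "The Marketplace for AI-Powered Professionals and AI Agents",
  "logo": "https://ugig.net/logo.png",
  "sameAs": ["https://www.linkedin.com/company/your-company"]
}
</script>
```

Once this is in place, a SoftwareApplication or Product block can be layered onto a /pricing page, and FAQPage onto any FAQ section.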

5. robots.txt and sitemap.xml Audit

  • robots.txt present (171 chars)
  • robots.txt references sitemap(s)
  • sitemap.xml present (1,537 URLs)

6. LLM / AI Crawler Accessibility

  • ⚠️ llms.txt missing: add /llms.txt, a concise, link-rich summary that helps LLMs orient on your site.
  • ⚠️ GPTBot not explicitly addressed: no User-agent: GPTBot block in robots.txt. We recommend explicit Allow rules so crawlers don't fall back to defaults.
  • ⚠️ ClaudeBot not explicitly addressed: no User-agent: ClaudeBot block in robots.txt. We recommend explicit Allow rules so crawlers don't fall back to defaults.
  • ⚠️ PerplexityBot not explicitly addressed: no User-agent: PerplexityBot block in robots.txt. We recommend explicit Allow rules so crawlers don't fall back to defaults.
  • ⚠️ Google-Extended not explicitly addressed: no User-agent: Google-Extended block in robots.txt. We recommend explicit Allow rules so crawlers don't fall back to defaults.
  • ⚠️ OAI-SearchBot not explicitly addressed: no User-agent: OAI-SearchBot block in robots.txt. We recommend explicit Allow rules so crawlers don't fall back to defaults.
  • ⚠️ Applebot-Extended not explicitly addressed: no User-agent: Applebot-Extended block in robots.txt. We recommend explicit Allow rules so crawlers don't fall back to defaults.
  • ⚠️ CCBot not explicitly addressed: no User-agent: CCBot block in robots.txt. We recommend explicit Allow rules so crawlers don't fall back to defaults.
  • skill.md present
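
Taken together, the crawler warnings above amount to one robots.txt change. A sketch of the explicit allow blocks follows; the sitemap URL shown is the conventional default location, not confirmed by this crawl, and the existing rules in the current 171-char robots.txt should be kept above these additions:

```text
# Explicit allow rules for AI answer-engine crawlers
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Google-Extended
Allow: /

User-agent: OAI-SearchBot
Allow: /

User-agent: Applebot-Extended
Allow: /

User-agent: CCBot
Allow: /

# Assumed default sitemap location
Sitemap: https://ugig.net/sitemap.xml
```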

7. Positioning Clarity

  • ⚠️ No pricing/plans link found: AI summaries commonly include pricing. Add a /pricing page even if pricing is custom.
  • About/Team path discoverable
  • H1 communicates value: The Marketplace for AI-Powered Professionals and AI Agents
  • Value-prop language detected
  • Contact / signup path discoverable

8. Missing or Hard-to-Find Information

  • 7 data point(s) could not be found from public pages: Pricing · Customer logos · Social proof · Recent launches · Blog post activity · New hires · Case studies or testimonials
  • ⚠️ Add JSON-LD structured data: start with Organization on the root layout and SoftwareApplication or Product on /pricing; add FAQPage on any FAQ section.
  • ⚠️ Add /llms.txt: a short, Markdown-flavored summary at the site root that includes your H1, value prop, top 5–10 links, and a pricing summary.
  • ⚠️ Add a /pricing page: even contact-us pricing benefits from a /pricing page that LLMs can link to in answers.
  • ⚠️ Add a canonical link: <link rel="canonical" href="https://yoursite.com/"> on every page to prevent duplicate-content drift.
  • ⚠️ Allow GPTBot in robots.txt: add an explicit User-agent: GPTBot block with Allow: / so this AI crawler can read your site.
  • ⚠️ Allow ClaudeBot in robots.txt: add an explicit User-agent: ClaudeBot block with Allow: / so this AI crawler can read your site.
  • ⚠️ Allow PerplexityBot in robots.txt: add an explicit User-agent: PerplexityBot block with Allow: / so this AI crawler can read your site.
  • ⚠️ Allow Google-Extended in robots.txt: add an explicit User-agent: Google-Extended block with Allow: / so this AI crawler can read your site.
  • ⚠️ Allow OAI-SearchBot in robots.txt: add an explicit User-agent: OAI-SearchBot block with Allow: / so this AI crawler can read your site.
  • ⚠️ Allow Applebot-Extended in robots.txt: add an explicit User-agent: Applebot-Extended block with Allow: / so this AI crawler can read your site.
  • ⚠️ Allow CCBot in robots.txt: add an explicit User-agent: CCBot block with Allow: / so this AI crawler can read your site.
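
For the /llms.txt item, a minimal sketch might look like this. The headline and /about link come from the audit above, while the /pricing and /signup paths are hypothetical until those pages exist:

```markdown
# uGig
> The Marketplace for AI-Powered Professionals and AI Agents

## Key pages
- [About](https://ugig.net/about): team and company background
- [Pricing](https://ugig.net/pricing): hypothetical path until a /pricing page ships
- [Sign up](https://ugig.net/signup): hypothetical path; use the real signup URL from the navigation
```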

9. Priority To-Do Checklist

  • P1 — Add JSON-LD structured data: start with Organization on the root layout and SoftwareApplication or Product on /pricing; add FAQPage on any FAQ section.
  • P2 — Add /llms.txt: a short, Markdown-flavored summary at the site root that includes your H1, value prop, top 5–10 links, and a pricing summary.
  • P2 — Add a /pricing page: even contact-us pricing benefits from a /pricing page that LLMs can link to in answers.
  • P3 — Add a canonical link: <link rel="canonical" href="https://yoursite.com/"> on every page to prevent duplicate-content drift.
  • P3 — Allow GPTBot in robots.txt: add an explicit User-agent: GPTBot block with Allow: / so this AI crawler can read your site.
  • P3 — Allow ClaudeBot in robots.txt: add an explicit User-agent: ClaudeBot block with Allow: / so this AI crawler can read your site.
  • P3 — Allow PerplexityBot in robots.txt: add an explicit User-agent: PerplexityBot block with Allow: / so this AI crawler can read your site.
  • P3 — Allow Google-Extended in robots.txt: add an explicit User-agent: Google-Extended block with Allow: / so this AI crawler can read your site.
  • P3 — Allow OAI-SearchBot in robots.txt: add an explicit User-agent: OAI-SearchBot block with Allow: / so this AI crawler can read your site.
  • P3 — Allow Applebot-Extended in robots.txt: add an explicit User-agent: Applebot-Extended block with Allow: / so this AI crawler can read your site.
  • P3 — Allow CCBot in robots.txt: add an explicit User-agent: CCBot block with Allow: / so this AI crawler can read your site.

Report by CrawlProof. Re-run the audit after every major website change.