GEO Crawlability Checker

Check whether AI bots — GPTBot, PerplexityBot, ClaudeBot, GrokBot and more — can actually crawl your site. See exactly which bots are blocked and why, plus whether you have llms.txt and a sitemap.


About this tool

What is GEO crawlability and why does it matter?

GEO (Generative Engine Optimization) crawlability refers to whether AI systems — like ChatGPT, Perplexity, and Claude — can actually fetch and index your web pages. If a bot is blocked in your robots.txt, the AI platform using that crawler will never see your content, meaning it can't cite or recommend you.
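
For example, OpenAI documents that GPTBot honors robots.txt, so the following two lines are enough to keep an entire site out of ChatGPT's crawl, and therefore out of its citation pool:

    User-agent: GPTBot
    Disallow: /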

Which AI bots does this checker test?

This tool checks for GPTBot (OpenAI/ChatGPT), PerplexityBot, ClaudeBot (Anthropic), GrokBot (xAI), Googlebot (which Gemini uses), CCBot (Common Crawl — used for training), Diffbot, Bytespider (ByteDance), meta-externalagent (Meta AI), and cohere-ai.
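
As a rough sketch of the kind of check this tool runs, the Python standard library's robots.txt parser can evaluate each of those user agents against a live robots.txt. The function name, bot list, and example domain below are illustrative, and the exact tokens each platform honors can change over time:

    from urllib.robotparser import RobotFileParser

    # User-agent tokens for the AI crawlers listed above (illustrative).
    AI_BOTS = [
        "GPTBot", "PerplexityBot", "ClaudeBot", "GrokBot", "Googlebot",
        "CCBot", "Diffbot", "Bytespider", "meta-externalagent", "cohere-ai",
    ]

    def check_crawlability(domain):
        """Return {bot: allowed} for the homepage of the given domain."""
        parser = RobotFileParser()
        parser.set_url(f"https://{domain}/robots.txt")
        parser.read()  # fetch and parse the live robots.txt
        homepage = f"https://{domain}/"
        return {bot: parser.can_fetch(bot, homepage) for bot in AI_BOTS}

    if __name__ == "__main__":
        for bot, allowed in check_crawlability("example.com").items():
            print(f"{bot:20} {'allowed' if allowed else 'BLOCKED'}")

Note that RobotFileParser treats a missing robots.txt as allow-all, and this sketch only evaluates robots.txt rules; a full check also looks for llms.txt and a sitemap, as the tool above does.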

What should my robots.txt look like for maximum AI visibility?

For maximum AI visibility, either have no Disallow rules that apply to AI bots, or add an explicit User-agent entry with Allow: / for each crawler you want to admit. Avoid a blanket 'User-agent: * Disallow: /' unless you pair it with explicit Allow rules for every AI bot you do want crawling your site.
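
For instance, a robots.txt that keeps a restrictive default but explicitly admits selected AI crawlers might look like the sketch below (the chosen bots are illustrative):

    # Explicitly admit the AI crawlers you want
    User-agent: GPTBot
    Allow: /

    User-agent: PerplexityBot
    Allow: /

    User-agent: ClaudeBot
    Allow: /

    # Restrictive default for every other crawler
    User-agent: *
    Disallow: /

This works because a crawler follows the most specific User-agent group that matches it, so the named bots use their own Allow: / groups and ignore the blanket Disallow.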

What is llms.txt and should I have one?

llms.txt is an emerging proposed standard, analogous in spirit to robots.txt, that gives LLMs explicit guidance on which content on your site to prioritize, what to skip, and how to attribute your content. Platform support is still early rather than universal, but the file costs little to add and adoption is growing. Use our llms.txt Generator tool to create one for your site.
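
Under the proposal at llmstxt.org, llms.txt is a markdown file served from the site root: an H1 title, a one-line blockquote summary, then sections of annotated links. A minimal sketch, with every name and URL hypothetical:

    # Example Company

    > Plain-language summary of what the site offers and who it is for.

    ## Docs

    - [Getting started](https://example.com/docs/start): product overview
    - [API reference](https://example.com/docs/api): endpoint documentation

    ## Optional

    - [Blog](https://example.com/blog): long-form articles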

Does being crawlable guarantee AI will cite my site?

Crawlability is the first prerequisite — if bots are blocked, nothing else matters. But being crawlable doesn't guarantee citations. Your content also needs to be well-structured, authoritative, and directly answer the questions being asked. Use the Content Readiness Checker and Schema Analyzer tools for the next steps.