Free tool

Free Robots.txt AI Bot Checker

Paste your URL and instantly see which AI crawlers can access your site. We read your live robots.txt and check 15+ bots — GPTBot, ClaudeBot, PerplexityBot, and more — so you know if you're accidentally blocking AI traffic.

Analyze your robots.txt

Technical SEO

Why AI bots matter

Crawler access in robots.txt is the first gate for training and retrieval use cases.

GPTBot & ChatGPT

OpenAI’s crawlers (including GPTBot and related user agents) collect public pages for ChatGPT browsing and model-related use. If they’re disallowed, ChatGPT may be less likely to rely on your site as a source.

ClaudeBot

Anthropic uses ClaudeBot and Anthropic-AI to retrieve web content for Claude. Your robots.txt determines whether those agents may fetch your pages at all.

PerplexityBot

PerplexityBot powers Perplexity’s live answers and citations. Blocking it can remove you from those answer panels, sometimes intentionally, sometimes by accident when a broad User-agent: * rule sweeps it up.

This tool only interprets explicit User-agent lines that match each bot’s token. If a bot isn’t listed in your file, we show “Not specified”; under standard robots.txt matching it then falls back to your User-agent: * group, though exact behavior varies by crawler.
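For example, in the hypothetical file below GPTBot has an explicit rule, while ClaudeBot is not named anywhere, so the tool would show ClaudeBot as “Not specified” even though it typically inherits the wildcard group:

```text
# GPTBot is named explicitly; the tool reports its rule directly.
User-agent: GPTBot
Disallow: /

# ClaudeBot appears nowhere in this file, so it shows as
# "Not specified" and usually falls back to this group instead.
User-agent: *
Allow: /
```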

FAQ

Robots.txt AI Bot Checker FAQ

What are AI bots and why do they crawl my website?
AI bots (like GPTBot, ClaudeBot, and PerplexityBot) are automated crawlers sent by AI companies to index web content. This data helps train AI models and provide real-time answers. If your robots.txt blocks these bots, AI assistants may not be able to reference or recommend your content when users ask relevant questions.
Which AI bots should I allow in my robots.txt?
At minimum, allow GPTBot (ChatGPT), ClaudeBot (Claude by Anthropic), PerplexityBot, and Google-Extended (Google's control token for Gemini training). These cover the major AI platforms where buyers discover brands. Blocking them means your content won't be included in AI-generated recommendations, which is increasingly where purchase decisions start.
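As a sketch, a robots.txt that explicitly allows all four could contain groups like these, alongside whatever rules you already have for other crawlers:

```text
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Google-Extended
Allow: /
```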
How do I unblock AI bots in my robots.txt?
Open your robots.txt file (usually at yoursite.com/robots.txt) and remove any Disallow rules for AI bot user agents. For example, if you see "User-agent: GPTBot / Disallow: /", change the Disallow line to "Allow: /" or remove the block entirely. If the bot isn't mentioned at all, it falls back to your User-agent: * rules; if those don't block it either, access is allowed by default.
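A before-and-after illustration of unblocking a single bot:

```text
# Before: GPTBot is blocked from the entire site
User-agent: GPTBot
Disallow: /

# After: GPTBot may crawl everything
User-agent: GPTBot
Allow: /
```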
Will allowing AI bots hurt my website performance or security?
No. AI bots behave similarly to search engine crawlers like Googlebot — they read your public pages but don't modify anything. Their crawl rate is typically low and rarely affects server performance. Allowing them simply ensures your content can be referenced when AI assistants answer relevant questions.
What does "Not specified" mean in the results?
If a bot shows as "Not specified," it means your robots.txt doesn't explicitly mention that bot's user agent. In most cases, this means the bot is allowed to crawl by default. However, a wildcard "Disallow: /" rule for all user agents would still block it. Our tool checks for both specific and wildcard rules.
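The same fallback logic can be sketched with Python's standard-library urllib.robotparser, which, like most crawlers, applies the User-agent: * group to any bot not named explicitly (the robots.txt content and URL here are illustrative):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: GPTBot is named explicitly; PerplexityBot
# is not, so it inherits the wildcard group's "Disallow: /".
rules = """
User-agent: GPTBot
Allow: /

User-agent: *
Disallow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

print(parser.can_fetch("GPTBot", "https://example.com/page"))         # → True
print(parser.can_fetch("PerplexityBot", "https://example.com/page"))  # → False
```

Note that an unlisted bot here is blocked, not allowed, which is why "Not specified" in the results does not automatically mean the bot can crawl.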

Still have questions? Contact our team

Go beyond a one-off robots check

Monitor AI visibility over time, compare to competitors, and act on clear recommendations, not just a single static file.