🤖 robots.txt Tester

Paste a robots.txt file and a URL path, and see whether Googlebot / GPTBot / ClaudeBot are allowed to crawl it. Correctly handles Allow / Disallow precedence with the longest-match rule.

✅ 100% free, no signup, browser-only

🤖 All major crawlers at once

📖 How to Use

  1. Paste your robots.txt content
    Paste the content of your robots.txt into the left textarea. Use the Sample button to try a typical robots.txt example.
  2. Enter the URL path and User-agent to test
    Enter a path such as /admin/ in the Test URL path field and select a User-agent (Googlebot, GPTBot, ClaudeBot, etc.) from the dropdown.
  3. Review the verdict and bulk crawler check
    The verdict (ALLOWED / DISALLOWED) and the matched rule are shown for the selected User-agent. The bulk table at the bottom shows results for all major crawlers at once.
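
The evaluation the steps above describe can be sketched in a few lines of Python. This is an illustrative simplification, not the tool's actual code: it uses plain prefix matching (no * or $ wildcards) and exact User-agent group names rather than Google's substring matching, and the function names are assumptions.

```python
# Sketch of robots.txt evaluation: collect the Allow/Disallow rules for a
# User-agent group (falling back to the "*" group), then apply the
# longest-match rule. Wildcards (* and $) are intentionally not handled here.

def parse_rules(robots_txt: str, user_agent: str) -> list[tuple[str, str]]:
    """Return (directive, path) pairs for user_agent, case-insensitively,
    falling back to the "*" group. Exact group-name match is a simplification;
    real crawlers match group names by substring."""
    groups: dict[str, list[tuple[str, str]]] = {}
    current_agents: list[str] = []
    seen_rule = False
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()  # strip comments
        if ":" not in line:
            continue
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip()
        if field == "user-agent":
            if seen_rule:          # a new group starts after a rule line
                current_agents, seen_rule = [], False
            current_agents.append(value.lower())
        elif field in ("allow", "disallow"):
            seen_rule = True
            for agent in current_agents:
                groups.setdefault(agent, []).append((field, value))
    return groups.get(user_agent.lower(), groups.get("*", []))

def is_allowed(robots_txt: str, user_agent: str, path: str) -> bool:
    """Longest matching rule wins; on a length tie, Allow wins."""
    rules = parse_rules(robots_txt, user_agent)
    matches = [(len(p), d == "allow")
               for d, p in rules if p and path.startswith(p)]
    if not matches:
        return True  # no rule matched: crawling is allowed
    return max(matches)[1]  # tuple max: longest first, then Allow over Disallow

robots = """User-agent: *
Disallow: /admin/
Allow: /admin/public/
"""
print(is_allowed(robots, "Googlebot", "/admin/"))             # False
print(is_allowed(robots, "Googlebot", "/admin/public/page"))  # True
```

With this sample file, /admin/ is DISALLOWED while /admin/public/page is ALLOWED, because the Allow rule's matching path is longer.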

❓ FAQ

What if both Disallow: / and Allow: / are present?
Google applies the longest-match rule: the rule whose path matches the most characters of the URL wins. If an Allow and a Disallow rule tie on match length, Allow wins. This tool follows the same logic.
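
The tie-break can be shown in a short, self-contained sketch (the function name verdict is illustrative, not the tool's API): every rule whose path is a prefix of the URL path is collected, the longest match wins, and Allow beats Disallow on a tie.

```python
# Longest-match with Allow winning ties, via tuple comparison:
# (length, is_allow) compares by length first, then True (allow) > False.

def verdict(rules: list[tuple[str, str]], path: str) -> str:
    """rules: (directive, path) pairs, e.g. ("allow", "/")."""
    matches = [(len(p), d == "allow")
               for d, p in rules if p and path.startswith(p)]
    if not matches:
        return "ALLOWED"
    return "ALLOWED" if max(matches)[1] else "DISALLOWED"

# Disallow: / and Allow: / both match "/page" with length 1 — Allow wins:
print(verdict([("disallow", "/"), ("allow", "/")], "/page"))  # ALLOWED
```

A longer Disallow still beats a shorter Allow, so verdict([("disallow", "/admin/"), ("allow", "/")], "/admin/x") returns DISALLOWED.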
Is the Sitemap directive parsed?
This tool focuses on Allow / Disallow rule evaluation. Sitemap, Crawl-delay, and other directives are ignored.
Is User-agent matching case-sensitive?
This tool uses case-insensitive matching. Googlebot and googlebot are treated the same. Real Google crawlers also match case-insensitively.
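
Case-insensitive group selection amounts to lowercasing both sides before lookup; this minimal sketch (select_group is a hypothetical name, not the tool's API) also shows the fallback to the "*" group when no named group matches.

```python
# Case-insensitive User-agent group selection with "*" fallback.

def select_group(groups: dict[str, list[str]], user_agent: str) -> list[str]:
    """Match group names case-insensitively; fall back to the "*" group."""
    lowered = {name.lower(): rules for name, rules in groups.items()}
    return lowered.get(user_agent.lower(), lowered.get("*", []))

groups = {"Googlebot": ["Disallow: /private/"], "*": []}
print(select_group(groups, "GOOGLEBOT"))  # ['Disallow: /private/']
print(select_group(groups, "ClaudeBot"))  # []
```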