AI Crawler Check
Free Bot Analysis Tool

Robots.txt Validator

Paste your robots.txt or enter a URL to analyze it against our database of 160+ bots. See which crawlers are blocked, allowed, or partially restricted — with SEO impact analysis.

Frequently Asked Questions

What does the Robots.txt Validator check?
Our validator parses your robots.txt file and checks each rule against 160+ known bot user-agents across 8 categories: AI bots, search engines, Google bots, SEO tools, social media bots, data scrapers, cloud services, and other agents. It shows you exactly which bots are blocked, allowed, or partially restricted.
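As a rough illustration of that per-bot check, here is a minimal Python sketch using the standard library's urllib.robotparser. The robots.txt content, the handful of user-agents, and their category labels are placeholders for illustration, not the tool's actual 160+ bot database.

```python
# Minimal sketch: classify a few well-known user-agents against a robots.txt.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Disallow: /admin/
"""

# A handful of example user-agents, grouped roughly like the tool's categories.
BOTS = {
    "Googlebot": "search engine",
    "Bingbot": "search engine",
    "GPTBot": "AI bot",
    "CCBot": "data scraper",
}

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

for agent, category in BOTS.items():
    allowed = parser.can_fetch(agent, "/")  # check access to the site root
    print(f"{agent:<10} ({category}): {'allowed' if allowed else 'blocked'}")
```

Running this prints GPTBot as blocked and the search engine bots as allowed, which is the same per-bot verdict the validator reports for each entry in its database.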
What is the SEO Safety Score?
The SEO Safety Score (0-100) measures how your robots.txt affects search engine visibility. Blocking critical search engine crawlers (Googlebot, Bingbot) drastically reduces your score, while blocking AI bots and scrapers has no negative impact on it. A score of 100 means all search engine bots are allowed.
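The exact weighting behind the score is not spelled out here, but a simplified sketch of the idea might look like the following: start from 100, subtract a share for each blocked search engine crawler, and ignore AI bots and scrapers entirely. The bot list and the linear penalty are assumptions made for illustration only.

```python
# Hypothetical scoring sketch -- the real weighting may differ.
from urllib.robotparser import RobotFileParser

SEARCH_ENGINE_BOTS = ["Googlebot", "Bingbot", "DuckDuckBot", "Slurp"]  # illustrative subset

def seo_safety_score(robots_txt: str) -> int:
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    blocked = [bot for bot in SEARCH_ENGINE_BOTS
               if not parser.can_fetch(bot, "/")]
    # Each blocked search engine removes an equal share of the score;
    # AI bots and scrapers are deliberately ignored, so blocking them costs nothing.
    return round(100 * (1 - len(blocked) / len(SEARCH_ENGINE_BOTS)))

print(seo_safety_score("User-agent: GPTBot\nDisallow: /\n"))     # 100 -- only an AI bot is blocked
print(seo_safety_score("User-agent: Googlebot\nDisallow: /\n"))  # 75  -- a search engine is blocked
```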
How do I fix a robots.txt that blocks search engines?
If your robots.txt blocks Googlebot, Bingbot, or other search engine crawlers, use our Robots.txt Generator to create a new one. Choose the "SEO Optimized" preset, which blocks AI bots and scrapers while keeping all search engines allowed.
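For a sense of what such a preset can look like, here is a hand-written robots.txt that blocks several well-known AI crawlers while leaving everything else open, checked with Python's urllib.robotparser. The specific user-agents listed are examples; the generator's actual preset may differ.

```python
# Example of an "SEO Optimized"-style robots.txt: AI crawlers blocked,
# everything else (including search engines) allowed. Bot list is illustrative.
from urllib.robotparser import RobotFileParser

SEO_OPTIMIZED = """\
User-agent: GPTBot
User-agent: ClaudeBot
User-agent: CCBot
User-agent: Google-Extended
Disallow: /

User-agent: *
Allow: /
"""

parser = RobotFileParser()
parser.parse(SEO_OPTIMIZED.splitlines())

assert parser.can_fetch("Googlebot", "/")      # search engines stay in
assert not parser.can_fetch("GPTBot", "/")     # AI training crawlers stay out
print("Search engines allowed, AI bots blocked.")
```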
Can I validate robots.txt for any website?
Yes! Enter any public URL and we will fetch its robots.txt file automatically. Alternatively, you can paste robots.txt content directly if you have it locally. The analysis is instant and covers all 160+ bots in our database.
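A minimal sketch of the URL-based path, assuming network access and using example.com as a stand-in domain: fetch the site's /robots.txt with the standard library and run the same per-bot check.

```python
# Fetch a public site's robots.txt and check a couple of user-agents against it.
from urllib.parse import urljoin
from urllib.robotparser import RobotFileParser

site = "https://example.com/"                      # placeholder for any public site
parser = RobotFileParser()
parser.set_url(urljoin(site, "/robots.txt"))
parser.read()                                      # fetches and parses the file

for agent in ("Googlebot", "GPTBot"):
    print(agent, "allowed" if parser.can_fetch(agent, "/") else "blocked")
```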
What's the difference between blocked and partially blocked?
Blocked means the bot is completely denied access to all paths (Disallow: /). Partially blocked means some paths are allowed while others are restricted — for example, allowing the homepage but blocking /admin/ or /api/ paths. Both are reflected in the analysis.
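One simple way to make that distinction concrete is to probe a few representative paths and classify the result, as in this sketch. The probe paths and the ExampleBot user-agent are illustrative assumptions, not the tool's actual method.

```python
# Sketch of the blocked / partially blocked / allowed distinction:
# probe a few paths and classify based on how many are reachable.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: ExampleBot
Disallow: /admin/
Disallow: /api/
"""

PROBE_PATHS = ["/", "/blog/post", "/admin/", "/api/v1/users"]

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

allowed = [p for p in PROBE_PATHS if parser.can_fetch("ExampleBot", p)]
if not allowed:
    status = "blocked"            # e.g. Disallow: / for this user-agent
elif len(allowed) == len(PROBE_PATHS):
    status = "allowed"
else:
    status = "partially blocked"  # homepage reachable, /admin/ and /api/ are not

print(status)  # partially blocked
```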