Robots.txt Tool
Generate, analyze, and test robots.txt files to control search engine crawlers. Keep crawlers focused on the pages you want indexed and away from the areas you don't.
All processing happens in your browser – no data leaves your computer!
Robots.txt Best Practices
Use Specific User-agents When Needed
Target specific crawlers like Googlebot when you need different rules for different search engines. Note that a crawler obeys only the most specific group that matches its user-agent, so Googlebot follows its own group and ignores the User-agent: * group; repeat shared rules (like Disallow: /private/) in every group that needs them.
User-agent: Googlebot
Disallow: /private/

User-agent: *
Disallow: /admin/
Disallow: /private/
Be Careful With Wildcards
Wildcards (*) match any sequence of characters, so a single pattern can block far more URLs than you intend. Test your rules thoroughly before publishing.
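Wildcard patterns are easy to over-scope. For example (the paths here are illustrative):

User-agent: *
# Meant to block temp files, but also matches /template/ and /contemporary-art/
Disallow: /*temp
# Meant to block on-site search results, but blocks every URL containing a query string
Disallow: /*?

Even without wildcards, rules match by prefix, so Disallow: /private also blocks /privateer.html.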
Include Your Sitemap
Help search engines find your sitemap by including it in your robots.txt file.
Sitemap: https://example.com/sitemap.xml
Don’t Use Robots.txt for Sensitive Data
Robots.txt is publicly readable at /robots.txt, so listing a path in it actually advertises that the path exists. Disallowed URLs can also still end up indexed if other sites link to them. Use proper authentication for sensitive areas.
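As a final check, you can verify how a standards-compliant crawler interprets your rules with Python's built-in urllib.robotparser (a minimal sketch; the rules and URLs are illustrative, and note that this parser implements the original exclusion standard, so path wildcards like those above are not honored):

import urllib.robotparser

# Illustrative rules; for a live site, use set_url("https://example.com/robots.txt") and read().
rules = """\
User-agent: *
Disallow: /admin/
Disallow: /private/
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# can_fetch(user_agent, url) returns True if the agent may crawl the URL.
print(parser.can_fetch("*", "https://example.com/admin/settings"))  # False
print(parser.can_fetch("*", "https://example.com/blog/post"))       # True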