Robots.txt Generator
Create SEO-optimized robots.txt files with modern rules for search engines and AI crawlers
Configuration
Granular Crawler Control
Search Engines:
- Google - Googlebot
- Bing - Bingbot
- Yahoo - Slurp
- Yandex - YandexBot
- Baidu - Baiduspider
- DuckDuckGo - DuckDuckBot
Major AI Crawlers:
- ChatGPT - GPTBot
- SearchGPT - OAI-SearchBot
- Claude - ClaudeBot
- Gemini - Google-Extended
- Meta AI - Meta-ExternalAgent
- Amazon - Amazonbot
- Apple - Applebot
- Common Crawl - CCBot
- Perplexity - PerplexityBot
- And 15+ more AI crawlers
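As an illustration, a generated file that blocks the AI crawlers listed above while leaving search engines unrestricted could look like the sketch below. The user-agent tokens are the ones published by each vendor; the sitemap URL is a placeholder you would replace with your own. Grouping several User-agent lines over one rule set is valid under RFC 9309:

```txt
# Traditional search engines: full access
User-agent: Googlebot
User-agent: Bingbot
Allow: /

# AI crawlers: blocked site-wide
User-agent: GPTBot
User-agent: CCBot
User-agent: PerplexityBot
Disallow: /

# Default rule for all other crawlers
User-agent: *
Allow: /

Sitemap: https://yoursite.com/sitemap.xml
```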
Considerations:
- Full control over which crawlers can access your content
- Protection of intellectual property from AI training
- Reduced server load from aggressive crawling
- Preservation of your competitive advantage
- Individual allow/block decisions per crawler
SEO Optimized
Modern rules for all major search engines
AI Aware
Control ChatGPT, Gemini, and other AI crawlers
Security First
Blocks malicious and unwanted crawlers
Ready to Use
Copy or download your robots.txt instantly
Frequently Asked Questions
What is robots.txt and why do I need it?
Robots.txt is a plain-text file that tells search engines and web crawlers which parts of your website they may or may not access. It is essential for SEO control, keeping private pages out of search indexes, and managing crawler traffic to reduce server load.
Should I block AI crawlers?
It depends on your goals. Block AI crawlers if you want to keep your content out of AI training sets, protect a competitive advantage, or reduce server load. Allow them if you want your content to be discoverable through AI search and don't mind contributing to AI training datasets.
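Whatever you decide, you can verify the rules behave as intended before deploying. The short sketch below uses Python's standard-library `urllib.robotparser` to test a sample policy (the policy text and the example URL are illustrative, not part of the generator's output):

```python
from urllib import robotparser

# Sample policy: block GPTBot everywhere, allow everyone else
POLICY = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(POLICY)

# The AI crawler is blocked; a regular browser agent is not
print(rp.can_fetch("GPTBot", "https://example.com/article"))       # False
print(rp.can_fetch("Mozilla/5.0", "https://example.com/article"))  # True
```

The same check works against a live site by calling `rp.set_url("https://yoursite.com/robots.txt")` followed by `rp.read()` instead of `rp.parse(...)`.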
Where do I place my robots.txt file?
Place robots.txt in your website's root directory, so it is reachable at https://yoursite.com/robots.txt. The file must be named exactly "robots.txt" (lowercase) and be publicly accessible.
How often should I update robots.txt?
Update robots.txt whenever you add new sections to your site, change your privacy requirements, or when new AI crawlers emerge. Given how quickly the AI landscape changes, checking quarterly for new crawlers is a sensible baseline.