
Robots.txt Generator

Create robots.txt files with AI crawler blocking options



About Robots.txt Generator

Generate robots.txt files for your website with easy-to-use options. Include rules for search engine crawlers, AI bots, and custom paths. Supports sitemap URL declaration and crawl delay.

How It Works

Configure your robots.txt rules through simple options. Toggle allow/disallow for common paths, add sitemap URLs, set crawl delays, and optionally block known AI crawlers.
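As a rough illustration of how such a generator could assemble its output, here is a minimal sketch. The function name and option names are hypothetical, not the tool's actual implementation:

```python
# Hypothetical sketch of a robots.txt builder: options in, text out.
def build_robots_txt(disallow=(), sitemap=None, crawl_delay=None, block_ai=False):
    """Assemble a robots.txt string from simple options (illustrative only)."""
    lines = ["User-agent: *"]
    if disallow:
        lines += [f"Disallow: {path}" for path in disallow]
    else:
        lines.append("Disallow:")  # an empty Disallow permits everything
    if crawl_delay is not None:
        lines.append(f"Crawl-delay: {crawl_delay}")
    if block_ai:
        # Assumed block list; the tool's actual list may differ.
        for bot in ("GPTBot", "CCBot", "Google-Extended", "anthropic-ai"):
            lines += ["", f"User-agent: {bot}", "Disallow: /"]
    if sitemap:
        lines += ["", f"Sitemap: {sitemap}"]
    return "\n".join(lines) + "\n"

print(build_robots_txt(disallow=["/admin/", "/api/"],
                       sitemap="https://example.com/sitemap.xml"))
```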

Step by Step

  1. Choose to allow all or selectively disallow paths
  2. Add paths to block (e.g., /admin/, /api/)
  3. Optionally add your sitemap URL
  4. Set crawl delay if needed
  5. Copy the generated robots.txt content
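Walking through the steps above with /admin/ and /api/ blocked, a sitemap URL, and a crawl delay (example values, not defaults) yields a file like this:

```
User-agent: *
Disallow: /admin/
Disallow: /api/
Crawl-delay: 10

Sitemap: https://example.com/sitemap.xml
```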

Tips

  • Always include a Sitemap directive for better crawling
  • Use Crawl-delay sparingly — most major crawlers ignore it
  • Test your robots.txt with Google Search Console
  • The AI crawler block list includes GPTBot, CCBot, and others

Frequently Asked Questions

What AI crawlers can be blocked?
GPTBot (OpenAI), ChatGPT-User, CCBot (Common Crawl), Google-Extended, anthropic-ai, and several others.
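With AI crawler blocking enabled, the generated file adds one rule group per bot, along the lines of:

```
User-agent: GPTBot
Disallow: /

User-agent: ChatGPT-User
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: anthropic-ai
Disallow: /
```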
Does robots.txt actually prevent crawling?
It's a directive, not enforcement. Well-behaved bots respect it, but malicious crawlers may ignore it.
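You can see how a well-behaved bot interprets these rules with Python's standard-library parser. The rules and URLs below are made-up examples, fed to the parser directly rather than fetched over the network:

```python
from urllib.robotparser import RobotFileParser

# Example rules: block /admin/ for everyone, block GPTBot entirely.
rules = """\
User-agent: *
Disallow: /admin/

User-agent: GPTBot
Disallow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "https://example.com/blog/post"))    # path not disallowed
print(rp.can_fetch("*", "https://example.com/admin/users"))  # path disallowed
print(rp.can_fetch("GPTBot", "https://example.com/"))        # agent blocked site-wide
```

A compliant crawler runs exactly this kind of check before fetching a page; a malicious one simply skips it, which is why robots.txt is advisory rather than access control.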