robots.txt Generator - Generate Crawler Configuration

Generate the robots.txt file your site needs for SEO. Enter a User-agent and your Allow/Disallow rules, and the file is ready to download.


How to Use

  1. Enter User-agent (default * targets all crawlers)
  2. Enter paths to disallow crawling, one per line
  3. Enter paths to allow, one per line (optional)
  4. Click the Download button to save the generated robots.txt file
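Following the steps above, blocking a hypothetical /admin/ directory while allowing its /admin/public/ subfolder would produce a file like this:

```txt
User-agent: *
Disallow: /admin/
Allow: /admin/public/
```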

FAQ

What is robots.txt?
A plain text file, placed at the root of your site, that tells search engine crawlers which paths they may and may not access.
What happens if I set Disallow to /?
It blocks crawling of the entire site. Note that Disallow prevents crawling, not indexing: a blocked page can still appear in search results if other sites link to it. To keep pages out of the index reliably, use a noindex directive instead.
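For example, this minimal configuration blocks all crawlers from the entire site:

```txt
User-agent: *
Disallow: /
```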
Is my data sent to a server?
No. All processing happens in your browser.
Can multiple crawlers be specified in User-agent?
Yes. By adding multiple User-agent directives, you can set different rules for each crawler. Use * (asterisk) to apply to all crawlers.
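As a sketch (the blocked paths here are hypothetical), separate groups can give Googlebot and all other crawlers different rules:

```txt
User-agent: Googlebot
Disallow: /drafts/

User-agent: *
Disallow: /private/
```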
Can the Sitemap directive also be added?
Yes. Including your sitemap URL as a Sitemap directive in robots.txt tells crawlers where to find your sitemap.
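For example, with a placeholder sitemap URL (Sitemap is a standalone directive, so it sits outside any User-agent group):

```txt
User-agent: *
Disallow:

Sitemap: https://example.com/sitemap.xml
```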

Update History

Last Updated: 2026-02-21

  • 2026-02-21 Initial release