Free Robots.txt Generator

Create a robots.txt file to control search engine crawling

Configuration

  • Sitemap URL: help search engines find your sitemap
  • Crawl delay: time to wait between requests (use only if needed)
  • Blocked paths: block specific paths, chosen from common presets or entered manually

Generated robots.txt

# robots.txt generated by Keywords Cluster

User-agent: *
Allow: /

Sitemap: https://yoursite.com/sitemap.xml
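
If you block paths or set a crawl delay in the configuration, the corresponding directives are added to the output. A sketch of what such a file might look like (the blocked paths and the delay value are placeholders):

# robots.txt generated by Keywords Cluster

User-agent: *
Allow: /
Disallow: /admin/
Disallow: /cart/
Crawl-delay: 10

Sitemap: https://yoursite.com/sitemap.xml

Note that Crawl-delay is not part of the original robots.txt standard: Google ignores it, while some other crawlers such as Bing honor it, which is why the option should be used only if needed.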

How to Use:

  1. Download or copy the generated robots.txt file
  2. Upload it to your website's root directory
  3. Make sure it's accessible at: yoursite.com/robots.txt
  4. Test it using Google Search Console's robots.txt Tester (a quick programmatic check is also sketched below)
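
After uploading, you can verify individual rules programmatically. A minimal sketch using Python's standard-library robots.txt parser (the domain and paths are placeholders):

from urllib.robotparser import RobotFileParser

# Fetch and parse the live robots.txt file (placeholder domain).
rp = RobotFileParser()
rp.set_url("https://yoursite.com/robots.txt")
rp.read()

# Check whether any crawler ("*") may fetch specific paths.
print(rp.can_fetch("*", "/"))        # True with the default Allow: / rule
print(rp.can_fetch("*", "/admin/"))  # False if /admin/ is disallowed

This reads the same rules a crawler would see, so it is a quick way to confirm the uploaded file matches what the generator produced.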

Important:

Blocking pages in robots.txt doesn't prevent them from appearing in search results if they're linked from other sites. Use noindex meta tags for that purpose.
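
For reference, a noindex directive is set on the page itself rather than in robots.txt. In the page's HTML head:

<meta name="robots" content="noindex">

Or as an HTTP response header:

X-Robots-Tag: noindex

Keep in mind that the page must remain crawlable (not blocked in robots.txt) for search engines to see the noindex directive.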

What is robots.txt?

The robots.txt file tells search engine crawlers which pages or sections of your site they can or cannot access. It's a simple text file placed in your website's root directory.

Common Use Cases:

  • Block admin areas and private sections
  • Prevent duplicate content from being indexed
  • Save crawl budget by blocking unimportant pages
  • Block search results and filter pages (see the example below)
  • Specify your sitemap location
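
For the search-results and filter-pages case, the rules might look like the following (the path and parameter name are placeholders; wildcard patterns such as * are supported by major crawlers like Google and Bing, but not guaranteed everywhere):

User-agent: *
Disallow: /search/
Disallow: /*?filter=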

Need Advanced SEO Tools?

Get site audits, keyword research, and technical SEO analysis

Start Free Trial