Create a robots.txt file to control search engine crawling
Common directives:
Sitemap - tells search engines where to find your sitemap.
Crawl-delay - time in seconds a crawler should wait between requests (use only if needed).
Disallow - blocks crawlers from the listed path, commonly used for admin or private sections.
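A robots.txt that combines these directives might look like the following sketch (the path and sitemap URL are placeholders, not recommendations for any specific site):

```
User-agent: *
Disallow: /admin/
Crawl-delay: 10
Sitemap: https://yoursite.com/sitemap.xml
```

Note that Crawl-delay is not part of the original robots.txt convention: some crawlers such as Bing honor it, while Googlebot ignores it.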
# robots.txt generated by Keywords Cluster
User-agent: *
Allow: /
Sitemap: https://yoursite.com/sitemap.xml
Important:
Blocking pages in robots.txt doesn't prevent them from appearing in search results if they're linked from other sites. Use noindex meta tags for that purpose.
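A noindex directive is placed in the page's own HTML head, for example:

```html
<meta name="robots" content="noindex">
```

For this to work, the page must remain crawlable: if robots.txt blocks the page, crawlers never fetch it and never see the noindex tag.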
The robots.txt file tells search engine crawlers which pages or sections of your site they can or cannot access. It's a simple text file placed in your website's root directory.
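To see how crawlers interpret these rules, Python's standard-library `urllib.robotparser` can evaluate a robots.txt body directly (the rules and URLs below are illustrative, not from any real site):

```python
from urllib.robotparser import RobotFileParser

# Parse a robots.txt body directly, with no network request.
rules = """
User-agent: *
Disallow: /admin/
Allow: /
""".strip().splitlines()

rp = RobotFileParser()
rp.parse(rules)

# Check whether a generic crawler ("*") may fetch each URL.
print(rp.can_fetch("*", "https://example.com/admin/secret"))  # False
print(rp.can_fetch("*", "https://example.com/blog/post"))     # True
```

Rules are matched by path prefix, so `Disallow: /admin/` covers everything under that directory while the rest of the site stays allowed.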