Robots.txt Generator - SpeedTool


Create SEO-friendly robots.txt files to control crawler access

🌐 Website Information

🤖 Crawler Control

Select which crawlers to configure:

🚫 Path Rules

Add Allow or Disallow rules for specific paths, applied to the crawlers selected above.

⏱️ Crawl Delay (Optional)

Set crawl delay for specific bots (seconds between requests):

  • Googlebot
  • Bingbot
  • Yandex
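
In the generated file, each delay becomes a Crawl-delay line inside that bot's User-agent group. The values below are illustrative; note that Googlebot ignores Crawl-delay entirely (Google's crawl rate is managed through Search Console instead), while Bingbot and Yandex honor it.

```
User-agent: Bingbot
Crawl-delay: 10

User-agent: Yandex
Crawl-delay: 5
```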

📄 robots.txt Preview

⚠️ Important: Place your robots.txt file in the root directory of your domain (e.g., https://example.com/robots.txt)

What is a Robots.txt File?

A robots.txt file is a text file located in your website's root directory that tells search engine crawlers which pages or files they can or cannot request from your site.

It's part of the Robots Exclusion Protocol (REP), a group of web standards that regulate how robots crawl and index the web.

While robots.txt is not a security measure (savvy users can still access disallowed pages), it's essential for:

  • Preventing duplicate content issues
  • Managing crawl budget
  • Keeping internal pages out of search results
  • Directing crawlers to your sitemap

Basic Syntax

User-agent: *
Allow: /
Disallow: /admin/

Sitemap: https://example.com/sitemap.xml
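
A quick way to sanity-check rules like these is Python's standard-library parser. One caveat: `urllib.robotparser` applies rules in file order (first match wins) rather than Google's longest-match rule, so the redundant `Allow: /` line is omitted in this sketch; the URLs are placeholders.

```python
from urllib import robotparser

# Rules equivalent to the example above, minus the redundant Allow: /
# (which a first-match parser would let override the Disallow line).
rules = [
    "User-agent: *",
    "Disallow: /admin/",
]

parser = robotparser.RobotFileParser()
parser.parse(rules)

# Public pages are fetchable; anything under /admin/ is not.
print(parser.can_fetch("*", "https://example.com/"))            # True
print(parser.can_fetch("*", "https://example.com/admin/login")) # False
```

In production you would point the parser at your live file with `parser.set_url("https://example.com/robots.txt")` followed by `parser.read()`.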

Robots.txt Best Practices

Do

  • Place in root directory
  • Use consistent casing
  • Reference your sitemap
  • Test with Google Search Console

Don't

  • Block CSS or JS files
  • Use a noindex directive in robots.txt (unsupported by Google)
  • Disallow entire site
  • Block search result pages
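
For example, the most damaging mistake from the list above is a bare slash, which blocks every crawler from the entire site:

```
# Don't do this: blocks the whole site for all crawlers
User-agent: *
Disallow: /
```
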
💡 Pro Tips

  • Wildcards (*) are supported by major crawlers
  • Use $ to match the end of a URL
  • Crawl-delay is advisory only; Googlebot ignores it
  • Check server logs to monitor bot activity
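
The first two tips combine as follows (patterns are illustrative; * and $ are extensions honored by major crawlers such as Googlebot and Bingbot, not part of the original 1994 standard):

```
User-agent: *
# Block any URL containing a query string
Disallow: /*?
# Block URLs ending in .pdf ($ anchors the match to the end of the URL)
Disallow: /*.pdf$
```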