Robots.txt Maker - Block Bots & Crawlers
Quickly create robots.txt files for WordPress, Shopify, Wix, or custom websites. Block unwanted bots and optimize crawl budget.
Using Robots.txt for Platform SEO
Different Content Management Systems (CMS) have different structures that require unique crawl strategies. A one-size-fits-all robots.txt file rarely works perfectly.
WordPress Robots.txt Best Practices
WordPress sites have an admin area at `/wp-admin/` that shouldn't be crawled. However, blocking the entire `wp-includes` folder can be dangerous because it contains shared JavaScript and CSS assets used by the frontend; if Googlebot can't fetch them, it may fail to render your pages properly. The template provided by this tool uses `Allow` rules to unblock specific assets inside blocked folders so Google can render your page correctly.
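As a sketch (the exact template this tool generates may differ), a WordPress-friendly configuration pairs a broad `Disallow` with a targeted `Allow`:

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
```

The `Allow` line carves an exception out of the blocked folder, because `admin-ajax.php` is frequently called by frontend scripts even for logged-out visitors.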
Shopify Crawl Optimization
Shopify generates many duplicate URLs for products inside collections. Crawl budget is often wasted on sorted views (e.g., `?sort_by=price-ascending`). The Shopify template here specifically disallows these query parameters to keep Google focused on your primary product pages.
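For illustration, rules covering those sorted views might look like the following (the patterns are examples; Shopify's own default robots.txt already covers these and more):

```
User-agent: *
Disallow: /collections/*sort_by*
Disallow: /*?*sort_by=
```

The `*` wildcard matches any sequence of characters, so both patterns catch sort parameters wherever they appear in collection URLs.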
Blocking Bad Bots
While rules under `User-agent: *` apply to any bot without a more specific group, you can target specific "bad" bots by name to block them explicitly. For example:
User-agent: GPTBot
Disallow: /
This blocks OpenAI's GPTBot crawler from accessing (and training on) your content, while Googlebot and Bingbot remain unaffected.
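To block several AI crawlers at once, you can stack `User-agent` lines so they share one rule group (GPTBot, CCBot for Common Crawl, and Google-Extended for Google's AI training are real user-agent tokens; adjust the list to your needs):

```
User-agent: GPTBot
User-agent: CCBot
User-agent: Google-Extended
Disallow: /
```

Consecutive `User-agent` lines before a `Disallow` all receive the same rules, which keeps the file shorter than repeating the group for each bot.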
Testing Your Rules
After generating your file, it's critical to test it. Use the robots.txt report in Google Search Console (the successor to the legacy robots.txt Tester). Syntax errors (such as a missing colon) won't always invalidate the whole file — most parsers simply skip malformed lines — but a skipped line means the rule you intended is silently ignored.
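You can also sanity-check a draft locally before deploying it. This sketch uses Python's standard-library `urllib.robotparser` (note it implements basic prefix matching, not Google's longest-match `Allow` precedence, so keep checks simple); the domain and paths are placeholders:

```python
from urllib.robotparser import RobotFileParser

# Draft rules to verify, as a string instead of a live URL.
rules = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Disallow: /wp-admin/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# GPTBot is blocked everywhere; other bots only from /wp-admin/.
print(rp.can_fetch("GPTBot", "https://example.com/blog/post"))          # False
print(rp.can_fetch("Googlebot", "https://example.com/wp-admin/x.php"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/blog/post"))       # True
```

If an assertion surprises you, the rule group or pattern isn't doing what you expected — fix it before the file goes live.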