Robots.txt Generator
Build your robots.txt file visually. Configure crawl rules for search engines and AI bots with quick presets and common path suggestions.
Visual Robots.txt Editor
Create your robots.txt file without memorizing syntax. This visual editor lets you add multiple user-agent rules, disallow and allow paths, set crawl delays, and include your sitemap URL — all with a clean interface and instant preview.
Quick Presets
- Allow All Crawling — Lets all bots access your entire site (empty Disallow).
- Block All Crawling — Prevents all bots from accessing any page (Disallow: /).
- Block AI Bots Only — Allows search engines but blocks AI training crawlers like GPTBot, ChatGPT-User, CCBot, and others.
- Standard Website — Blocks common private paths like /admin/, /api/, /login while allowing everything else.
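As a rough sketch, the Standard Website preset would produce a file along these lines (the paths come from the preset description; the Sitemap URL is a placeholder you would replace with your own):

```
User-agent: *
Disallow: /admin/
Disallow: /api/
Disallow: /login

Sitemap: https://example.com/sitemap.xml
```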
Robots.txt Syntax
Each rule block starts with a User-agent line specifying which bot the rules apply to; separate blocks with a blank line. Use * for all bots or a specific name like Googlebot. Disallow lines block paths, while Allow lines can override Disallow rules for specific sub-paths. Major crawlers such as Googlebot apply the most specific (longest) matching rule.
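For example, a block like the following closes an entire directory to all bots but re-opens a single sub-path (the paths are illustrative):

```
User-agent: *
Disallow: /private/
Allow: /private/annual-report.html
```

Because the Allow rule is the longer, more specific match, compliant crawlers may fetch the report while the rest of /private/ stays blocked.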
AI Bot Blocking
With the rise of AI training crawlers, many website owners want to keep their content out of AI training datasets. This tool includes a dedicated preset for blocking major AI bots, including GPTBot (OpenAI), Google-Extended (Google AI), CCBot (Common Crawl), and anthropic-ai (Anthropic). Keep in mind that robots.txt is advisory: it relies on crawlers voluntarily honoring the rules.
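As a quick sanity check, Python's standard-library urllib.robotparser can confirm that rules in the style of the Block AI Bots Only preset stop an AI crawler while leaving search engines unaffected (the rules file and URL below are illustrative, not the tool's exact output):

```python
import urllib.robotparser

# Rules in the style of the "Block AI Bots Only" preset:
# AI training crawlers are denied everything, all other bots are allowed.
ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: *
Disallow:
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

url = "https://example.com/article"  # placeholder URL
print(parser.can_fetch("GPTBot", url))     # False: AI crawler is blocked
print(parser.can_fetch("Googlebot", url))  # True: search engine is allowed
```

The same check works against a live site by calling set_url() and read() instead of parse(), which is a convenient way to verify a deployed robots.txt.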
Related Tools
Optimize your content with the Keyword Density Checker, generate meta tags with the Meta Tag Generator, or build campaign URLs with the UTM Link Builder.