Free Robots.txt Generator
Generate a clean, SEO-friendly robots.txt file with support for multiple user-agents, allow/disallow rules, crawl-delay, sitemaps, and live URL testing — all in your browser.
Used only for context and sitemap suggestions; robots.txt rules themselves never need absolute URLs.
Presets modify the User-agent: * block. You can always customize further below.
Start with * for all crawlers. You can add specific bots like Googlebot.
Not all crawlers respect this directive, but it can reduce server load for those that do.
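Put together, a generated file with a catch-all block and a bot-specific crawl delay might look like the sketch below (ExampleBot, the `/private/` path, and the 10-second delay are all illustrative placeholders):

```
# Applies to every crawler
User-agent: *
Disallow: /private/

# Applies only to one named bot; the delay value is illustrative
User-agent: ExampleBot
Crawl-delay: 10
```

Crawlers that honor `Crawl-delay` will wait roughly that many seconds between requests; others simply ignore the line.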
Rules for selected user-agent
| Directive | Path | |
|---|---|---|
| No rules yet. Add some below. | | |
Robots.txt matching is prefix-based. For example, `Disallow: /admin` blocks `/admin/settings` as well.
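You can check this prefix behavior yourself with Python's standard-library `urllib.robotparser`; a minimal sketch, where `example.com` and the `/admin` rule are placeholders:

```python
from urllib import robotparser

# A two-line rule set: block everything under the /admin prefix.
rules = """\
User-agent: *
Disallow: /admin
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# /admin/settings shares the /admin prefix, so it is blocked too.
print(rp.can_fetch("*", "https://example.com/admin/settings"))  # False
# Paths that don't match any Disallow prefix are allowed by default.
print(rp.can_fetch("*", "https://example.com/about"))           # True
```

The URL Tester below applies the same prefix logic to the rules you configure.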
One sitemap per line. These will be added as Sitemap: lines at the end of robots.txt.
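The appended output would look like this sketch (the sitemap URLs are placeholders; unlike path rules, `Sitemap:` lines take full absolute URLs):

```
User-agent: *
Disallow: /admin

Sitemap: https://example.com/sitemap.xml
Sitemap: https://example.com/news-sitemap.xml
```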
robots.txt Preview
No site selected. Your robots.txt will appear here.
URL Tester
No test yet.
Saved Robots Profiles
Set up your rules, generate robots.txt, test URLs, and save profiles.