Free Robots.txt Generator

Generate a clean, SEO-friendly robots.txt file with support for multiple user-agents, allow/disallow rules, crawl-delay, sitemaps, and live URL testing — all in your browser.


Used only for context and sitemap suggestions; robots.txt rules themselves use relative path prefixes, not absolute URLs.

Presets modify the User-agent: * block. You can always customize further below.

Start with * for all crawlers. You can add specific bots like Googlebot.

Not all crawlers respect this directive (Googlebot, for example, ignores Crawl-delay), but it can reduce server load for those that do.
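For instance, a generated file might leave most crawlers unrestricted while throttling one specific bot (the bot name, delay, and path below are illustrative):

```
User-agent: *
Disallow:

User-agent: Bingbot
Crawl-delay: 10
Disallow: /search
```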

Rules for selected user-agent

Directive | Path
No rules yet. Add some below.

Robots.txt matching is prefix-based. For example, Disallow: /admin blocks /admin/settings as well.
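You can verify this prefix behavior with Python's standard `urllib.robotparser` module; a minimal sketch, with an illustrative rule and URLs:

```python
from urllib.robotparser import RobotFileParser

# Parse an in-memory robots.txt with one prefix rule.
rules = """\
User-agent: *
Disallow: /admin
"""
parser = RobotFileParser()
parser.parse(rules.splitlines())

# /admin is a prefix of both paths, so both are blocked...
print(parser.can_fetch("*", "https://example.com/admin"))           # False
print(parser.can_fetch("*", "https://example.com/admin/settings"))  # False
# ...while unrelated paths stay crawlable.
print(parser.can_fetch("*", "https://example.com/about"))           # True
```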

One sitemap per line. These will be added as Sitemap: lines at the end of robots.txt.

robots.txt Preview

No site selected
Your robots.txt will appear here.

Adjust rules on the left and click “Generate robots.txt”.
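A generated file combining multiple user-agent blocks, a crawl delay, and a sitemap might look like this (the domain, bot names, and paths are placeholders):

```
User-agent: *
Disallow: /admin
Disallow: /tmp
Crawl-delay: 5

User-agent: Googlebot
Allow: /

Sitemap: https://example.com/sitemap.xml
```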

URL Tester

No test yet.

Set up your rules, generate robots.txt, test URLs, and save profiles.