Robots.txt Tester

Fetch and parse any site's robots.txt, and test whether a path is crawlable.

What This Tool Does

Your robots.txt file tells search-engine crawlers which parts of your site they may request. A single misplaced Disallow line can block crawling of an entire section of your site, and pages that can't be crawled typically drop out of search results. This tool fetches the live robots.txt, parses every user-agent group, and lets you test whether a specific path is allowed for a specific crawler using Google's longest-match precedence rule: among all rules whose pattern matches the path, the one with the longest pattern wins, and ties go to Allow.
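The longest-match evaluation the tool performs can be sketched as follows. This is a minimal illustration, not the tool's actual implementation; the rule list is a hypothetical example, and only the `*` wildcard and `$` end anchor are handled:

```python
import re

def pattern_to_regex(pattern):
    # Translate a robots.txt path pattern to a regex:
    # '*' matches any character sequence, a trailing '$' anchors the end.
    anchored = pattern.endswith("$")
    core = pattern[:-1] if anchored else pattern
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in core)
    return re.compile(regex + ("$" if anchored else ""))

def is_allowed(rules, path):
    """rules: list of (directive, pattern) pairs for one user-agent group,
    e.g. ("Disallow", "/private/"). Longest matching pattern wins;
    on a length tie, Allow wins; no matching rule means allowed."""
    best = None  # (pattern_length, is_allow)
    for directive, pattern in rules:
        if not pattern:
            continue  # an empty Disallow matches nothing (allows everything)
        if pattern_to_regex(pattern).match(path):
            candidate = (len(pattern), directive.lower() == "allow")
            if best is None or candidate > best:
                best = candidate
    return True if best is None else best[1]

# Hypothetical rule set for illustration
rules = [("Disallow", "/blog/"), ("Allow", "/blog/public/")]
print(is_allowed(rules, "/blog/private-post"))  # False: Disallow matches
print(is_allowed(rules, "/blog/public/post"))   # True: longer Allow wins
```

Note the tie-break: because the comparison tuple puts pattern length first and the Allow flag second, two patterns of equal length resolve in favor of Allow, matching Google's documented behavior.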

Common Mistakes It Catches
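A few illustrative examples of the kinds of mistakes the tester surfaces. This is a hypothetical robots.txt, assembled to show common failure patterns rather than taken from any real site:

```text
User-agent: *
Disallow: /        # blanket block left over from staging: hides the whole site

Disallow: blog/    # missing leading slash: most parsers ignore this rule entirely
Disallow: /Blog/   # paths are case-sensitive: /blog/ remains crawlable

User-agent: Googlebot
Disallow:          # empty Disallow allows everything (easy to misread as a block)
```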

Want the Full Picture?

The SEO Audit API checks crawlability plus 81 other signals and returns a scored report. Free tier available.