Disallow All

Checks that the robots.txt file does not block search engine robots from crawling the entire website.

If the robots.txt file contains the rule Disallow: / for all user agents, compliant search engine crawlers will not crawl any page of the website. Because no content can be crawled, the site will generally stop appearing in search results. If you only need to keep crawlers out of certain sections of the website, write more specific rules that disallow only those paths, as in the sketch below.
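For instance, a minimal robots.txt sketch that blocks only two sections while leaving the rest of the site crawlable might look like this (the /admin/ and /tmp/ paths are illustrative placeholders, not paths from this test):

    User-agent: *
    Disallow: /admin/
    Disallow: /tmp/

Rules apply to the user agents named above them; User-agent: * covers every crawler that does not match a more specific group.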
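If you want to run this kind of check yourself, one possible sketch uses Python's standard urllib.robotparser module. The example.com URL below is a placeholder, and this is not necessarily how the test itself is implemented:

    from urllib.robotparser import RobotFileParser

    # Placeholder URL; substitute the site you want to check.
    ROBOTS_URL = "https://example.com/robots.txt"

    parser = RobotFileParser()
    parser.set_url(ROBOTS_URL)
    parser.read()  # fetch and parse the live robots.txt file

    # can_fetch("*", ...) asks whether a generic crawler may fetch the root.
    # A "Disallow: /" rule under "User-agent: *" makes this return False.
    if parser.can_fetch("*", "https://example.com/"):
        print("Crawlers may access the site root.")
    else:
        print("The site root is disallowed for all crawlers.")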

To learn more about the robots.txt standard, please check out our Robots.txt File Explanation article.
