Makes sure that a robots.txt file exists.
This test attempts to fetch the "robots.txt" file stored at the root of the website. Search engine crawlers read this file to learn which parts of the site they are allowed to crawl. If the robots.txt file doesn't exist, or is blank, most crawlers will treat the entire site as crawlable. If that is what you want, it is considered best practice to say so explicitly by publishing a minimal robots.txt file with a blanket allow-all rule.
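A minimal allow-all robots.txt, following the Robots Exclusion Protocol convention that every rule group starts with a `User-agent` line, looks like this (an empty `Disallow:` means no URL is disallowed):

```
# Apply to all crawlers
User-agent: *
# Disallow nothing, i.e. allow the whole site
Disallow:
```

An equivalent form uses `Allow: /` in place of the empty `Disallow:`; major crawlers accept both.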
Ready to validate your website for this test and 100+ others?