Tests each line of the robots.txt file to check for syntax errors.

If a robots.txt file has syntax errors, search engines may crawl and index the website incorrectly. This test parses your robots.txt file line by line and checks for problems such as invalid paths, duplicate rules, empty directives, and other common mistakes, then reports each one so you can correct it.
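A check like this can be sketched in a few lines. The snippet below is a minimal, illustrative linter, not the tool's actual implementation: the directive list and error messages are assumptions, and note that an empty `Disallow:` value is technically valid per the Robots Exclusion Protocol (it means "allow everything"), though many validators still flag it as suspect.

```python
# Minimal robots.txt lint sketch (illustrative; not the tool's real parser).
KNOWN_DIRECTIVES = {"user-agent", "allow", "disallow", "sitemap", "crawl-delay"}

def check_robots_txt(text):
    """Return a list of (line_number, message) tuples for suspect lines."""
    errors = []
    seen_rules = set()
    for lineno, raw in enumerate(text.splitlines(), start=1):
        line = raw.split("#", 1)[0].strip()  # drop comments and whitespace
        if not line:
            continue  # blank lines are fine
        if ":" not in line:
            errors.append((lineno, "missing ':' separator"))
            continue
        directive, _, value = line.partition(":")
        directive = directive.strip().lower()
        value = value.strip()
        if directive not in KNOWN_DIRECTIVES:
            errors.append((lineno, f"unknown directive '{directive}'"))
        elif not value:
            # Empty Disallow is valid per spec, but flagged here as a lint.
            errors.append((lineno, f"empty value for '{directive}'"))
        elif directive in ("allow", "disallow") and not value.startswith(("/", "*")):
            errors.append((lineno, f"path '{value}' should start with '/'"))
        elif (directive, value) in seen_rules:
            errors.append((lineno, f"duplicate rule '{directive}: {value}'"))
        seen_rules.add((directive, value))
    return errors
```

For example, feeding it a file containing `Disallow cgi-bin` (missing colon), `Disallow: temp` (relative path), a repeated `Disallow: /temp` rule, and an unrecognized directive would yield one reported error per faulty line.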