Robots.txt File Validator

Learn how to create a robots.txt file that blocks specific bots from crawling your website, then validate and improve it.


A robots.txt file is a plain text file placed at the root level of a website. Its purpose is to ask bots, spiders, and crawlers to leave the website (or parts of it) alone. You may want to do this if you are worried that your web server will be overwhelmed by requests from a bot crawling your site. To learn more about the inner workings of this file, read our Robots.txt Explanation article.
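For example, a minimal robots.txt that asks every crawler to stay out of a /private/ directory (an illustrative path, not one your site necessarily has) while leaving the rest of the site open looks like this:

User-agent: *
Disallow: /private/

The file must be reachable at the root of your domain, e.g. https://yourdomain.com/robots.txt; crawlers will not look for it anywhere else.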

Import Existing Robots.txt File

Upload or paste your robots.txt file to validate it

Want to block AI bots?

AI bots will slurp up all the information on your website and use it to train Large Language Models (LLMs). These LLMs can then be used to generate new text and images based on your content. If you believe this constitutes plagiarism, you may want to prevent it. Copy and paste the lines below into your robots.txt file to block the most common AI bots; a quick way to check the result with a short script is sketched after the list.

User-agent: anthropic-ai
User-agent: Applebot-Extended
User-agent: Bytespider
User-agent: CCBot
User-agent: Claude-Web
User-agent: ClaudeBot
User-agent: cohere-ai
User-agent: Diffbot
User-agent: FacebookBot
User-agent: Google-Extended
User-agent: GPTBot
User-agent: Meta-ExternalAgent
User-agent: omgili
User-agent: Timpibot
Disallow: /
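
As a sanity check (separate from the ValidBot tools on this page), the sketch below uses Python's standard urllib.robotparser module to parse a local robots.txt file and report whether a few of the AI user agents listed above are blocked. The file path "robots.txt", the list of agents, and the sample URL are placeholders to adjust for your own setup.

# A minimal sketch, not part of ValidBot: parse a local robots.txt and
# report whether a few of the AI user agents above are blocked.
from urllib.robotparser import RobotFileParser

AI_AGENTS = ["GPTBot", "ClaudeBot", "CCBot", "Google-Extended"]

parser = RobotFileParser()
with open("robots.txt", encoding="utf-8") as handle:
    parser.parse(handle.read().splitlines())

for agent in AI_AGENTS:
    allowed = parser.can_fetch(agent, "https://www.example.com/some-page.html")
    print(f"{agent}: {'allowed' if allowed else 'blocked'}")

If the rules were pasted in correctly, every agent in the list should be reported as blocked.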

Test Your Website

Once you have made changes to your robots.txt file, enter your domain name into the box below and run a free ValidBot Test. Check the "Common Files" section of the report to confirm that your robots.txt file is working correctly.
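
Before running the test, you can also confirm that the file is actually being served from your site root. Here is a minimal check using only the Python standard library; www.example.com is a placeholder for your own domain.

# Fetch the live file and confirm it is served with a 200 status.
from urllib.request import urlopen

with urlopen("https://www.example.com/robots.txt") as response:
    print(response.status)                      # expect 200
    print(response.read().decode("utf-8", errors="replace"))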