What is the Robots.txt Testing Tool?

The robots.txt tester tool shows you whether your current robots.txt file is blocking Google's search crawlers from accessing specific URLs on your site. In other words, you can use this tool to check whether Googlebot can crawl the URL of a page that you wish to keep out of Google search.
How to Test Your Robots.txt File with the Robots.txt Testing Tool

First, log in to Google Webmaster Tools, open the robots.txt tester, and from the list of your verified properties select the one you would like to test.
You will now see your current robots.txt file, and you can test different URLs to see whether Google's crawlers are disallowed from crawling them. Type a URL into the text box at the bottom of the page and press the Test button.
The Test button will then change to either "ACCEPTED" or "BLOCKED", depending on whether the URL you entered is blocked from Google's crawlers.
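If you prefer to check URLs locally, Python's standard library includes a robots.txt parser that performs the same kind of allow/block check the tester does. This is a minimal sketch; the rules and paths below are made-up examples, not taken from any real site.

```python
# Check URL paths against robots.txt rules locally, using Python's
# standard urllib.robotparser (no network access needed when we
# parse the rules from a string).
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for illustration only.
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
Allow: /public/
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Mimic the tester's ACCEPTED/BLOCKED verdict for a couple of paths.
for path in ("/public/page.html", "/private/page.html"):
    verdict = "ACCEPTED" if rp.can_fetch("Googlebot", path) else "BLOCKED"
    print(path, "->", verdict)
```

Note that this only approximates Google's own matching behaviour; for the authoritative answer on how Googlebot interprets your file, the tester tool itself is the reference.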
Make changes to the robots.txt file as needed and retest until you are satisfied. Once you have finished customizing and writing new rules, copy the whole file and paste it into the robots.txt file hosted on your site.
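As a reference for what the finished file might look like, here is a simple illustrative robots.txt (the `/search` rule and the example.com sitemap URL are placeholders, not rules you must use):

```
User-agent: *
Disallow: /search
Allow: /

Sitemap: https://example.com/sitemap.xml
```

Rules are matched per user agent from top to bottom, so keep related `Allow` and `Disallow` lines grouped under the same `User-agent` block.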
Note: This tool does not change your robots.txt file, so you have to upload the updated file yourself. The tool only tests against the copy loaded in the tool, not the live file on your site.
We have already written a tutorial on how to edit a robots.txt file in Blogger, so please take a look at it if you don't know how to make changes to your robots.txt file.
We hope this article has answered all your queries about creating a perfect robots.txt file without prior technical knowledge, as this tool makes things much easier. It is now simple to test, maintain, and be sure of your robots.txt file. If you have any questions or need help, feel free to leave a comment below or drop us a message via our contact page.