
Google launches a new Robots.txt testing tool

Over the past week Google has been making some major updates to Webmaster Tools, and one of the most noteworthy changes is a new robots.txt testing tool. In the past, Google would let webmasters know that URLs on their site were being blocked, but there was no integrated tool that pointed to the specific part of the robots.txt file causing the problem. Now users can test their robots.txt file before officially submitting it, to see what is being blocked, where, and which rule is preventing Google from crawling that portion of the domain. What's more, users can try out new changes and test the results without publishing the updated version.
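
To picture how that works, here is a minimal sketch of the kind of rule the tester flags (the user agent and paths below are hypothetical, shown only for illustration):

    User-agent: Googlebot
    Disallow: /private/

Entering a URL such as example.com/private/report.html into the tester would show it as blocked and highlight the "Disallow: /private/" line as the directive responsible.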

An added bonus? This new development will let you review past versions of the file, which can help you solve all those previously unanswered riddles about former performance problems. Even if you think your robots.txt file is perfect, Google recommends checking it for warnings and errors just in case. In SEO, information is power, and understanding how Google interprets your robots.txt file can be a big key to your success.

Check it out at Google Webmaster Central.