Google launches a new Robots.txt testing tool


Craig Smith  |  Founder & CEO

Over the past week, Google has made some major updates to its webmaster capabilities, and one of the most noteworthy changes is a new robots.txt testing tool. In the past, Google would let a website know that URLs were being blocked on its site, but it did not offer an integrated tool that pointed out the specific part of the robots.txt file causing the issue. Now users can test their robots.txt file prior to officially submitting it, to see what is blocked, where in the file the blocking directive sits, and why Google is being kept from crawling that portion of the domain. What's more, users can try out new changes and test the results without publishing the updated version.
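If you want a rough feel for what the tool is checking, you can approximate it locally with Python's standard-library robots.txt parser. This is just a minimal sketch, not Google's actual parsing logic (Google applies its own rules, such as Googlebot-specific groups); the robots.txt contents, user agent, and URLs below are hypothetical examples:

```python
from urllib import robotparser

# A hypothetical robots.txt that blocks one section of the site.
robots_txt = """\
User-agent: *
Allow: /public/
Disallow: /private/
"""

parser = robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())

# Test specific URLs the way the tool does: crawlable or blocked?
for url in ("https://example.com/public/page.html",
            "https://example.com/private/report.html"):
    verdict = "allowed" if parser.can_fetch("Googlebot", url) else "blocked"
    print(f"{url}: {verdict}")
```

Running this prints the first URL as allowed and the second as blocked, which is the same allowed/blocked verdict the tester surfaces, along with the directive responsible.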

An added bonus? The tool also lets you upload past versions of your file, which can help you solve all those previously unanswered riddles about former performance problems. Even if you think your robots.txt file is perfect, Google recommends checking it for warnings and errors just in case. In SEO, information is power, and understanding how Google interprets your robots.txt file can be a big key to your success.
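That version comparison is easy to mimic locally as well. Here is a small sketch that checks the same set of URLs against an old and a new robots.txt file and reports any URL whose crawlability changed; the file names and URLs are placeholders you would swap for your own:

```python
from urllib import robotparser

def decisions(robots_lines, urls, agent="Googlebot"):
    """Map each URL to True (crawlable) or False (blocked) under one robots.txt."""
    parser = robotparser.RobotFileParser()
    parser.parse(robots_lines)
    return {url: parser.can_fetch(agent, url) for url in urls}

urls = [
    "https://example.com/",
    "https://example.com/category/widgets/",
    "https://example.com/search?q=widgets",
]

# Hypothetical local copies of the old and new robots.txt versions.
with open("robots_old.txt") as f:
    old = decisions(f.read().splitlines(), urls)
with open("robots_new.txt") as f:
    new = decisions(f.read().splitlines(), urls)

# Report only the URLs whose crawlability changed between versions.
for url in urls:
    if old[url] != new[url]:
        before = "allowed" if old[url] else "blocked"
        after = "allowed" if new[url] else "blocked"
        print(f"{url}: {before} -> {after}")
```

A diff like this can point you to the exact release where a section of the site stopped being crawled, which is the kind of detective work the upload feature is meant for.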

Check it out at Google Webmaster Central.

