Google Reveals 2 Major Reasons Why They Might Not Be Crawling Your Site

Before you can get your ideal customer’s eyes on your site, you need to get Google’s. Frequent check-ins by the mega search engine make for a better chance of ranking in results. However, Google’s crawler, Googlebot, might be turning a blind eye to even your flashiest pages if your site is experiencing certain technical errors. Increasingly slow connect times and 5xx server error status codes will deter Googlebot from crawling a site, as Google’s Webmaster Trends Analyst, Gary Illyes, revealed at SMX East last week.

Site Speed Matters

If your site consistently takes a long time to connect, Google will slow down or entirely stop crawling so as not to overload your server. The same goes for 5xx server errors. There are a handful of 5xx-level status codes, which arise when a server receives a valid request but is unable to fulfill it, and each one tells Googlebot to hit the road.

5xx Error Codes Keep Google Away

These status codes act as a red flag to Google, signaling that your site is experiencing trouble and may easily crash if the crawler keeps pressing. Googlebot, which Incapsula cites as the source of 60.5 percent of all page crawls, will eventually return to the site. However, if these issues still haven’t been addressed, don’t expect the crawler to stick around. Slow load times and error pages will send both customers and bots bouncing, so be sure to fix these problems as soon as they arise.
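If you want to catch these problems before Googlebot does, even a simple check of your key pages for slow responses and 5xx status codes can help. Below is a minimal sketch in Python using the third-party requests library; the URLs and the “slow” threshold are placeholder assumptions you would replace with your own pages and targets, not values prescribed by Google.

```python
# Minimal sketch: flag pages that respond slowly or return 5xx server errors.
# Assumes the third-party "requests" library is installed (pip install requests).
# PAGES_TO_CHECK and SLOW_THRESHOLD_SECONDS are hypothetical placeholders.
import requests

PAGES_TO_CHECK = [
    "https://www.example.com/",
    "https://www.example.com/products",
]
SLOW_THRESHOLD_SECONDS = 2.0  # example cutoff for a "slow" response


def check_page(url):
    """Fetch a URL and report server errors or slow responses."""
    try:
        response = requests.get(url, timeout=10)
    except requests.exceptions.RequestException as exc:
        return f"{url}: request failed ({exc})"

    elapsed = response.elapsed.total_seconds()
    if response.status_code >= 500:
        return f"{url}: server error {response.status_code}"
    if elapsed > SLOW_THRESHOLD_SECONDS:
        return f"{url}: slow response ({elapsed:.1f}s)"
    return f"{url}: OK ({response.status_code}, {elapsed:.1f}s)"


if __name__ == "__main__":
    for page in PAGES_TO_CHECK:
        print(check_page(page))
```

Running a check like this on a schedule, alongside the crawl error reports in Google Search Console, gives you an early warning before slow connects or server errors start turning the crawler away.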

If Google cannot effectively crawl your site, it will not register the content available on it, which makes it less likely you will appear in your customers’ search results. Fixing your load times and minimizing site errors doesn’t just benefit Google, however; it makes for a better customer experience as well.