A Patent From The Panda That Almost Was


Craig Smith  |  Founder & CEO

The Panda update took aim at low-quality content on the web, promoting sites that Google deemed to have high-quality content while penalizing those that didn’t. Like other algorithm updates, Panda shook up the SEO world by shifting site ranking unpredictably, but a recently approved patent points to a Panda far more aggressive than the one finally released.

In the benign-sounding patent “Processing web pages based on content quality,” inventors Brandon Bilinski and Stephen Kirkham outline how Google’s algorithm would index a site and decide whether it contained high-quality content or was a low-quality site, such as a content farm or a parked domain page. This portion of the algorithm, at least, seems to have made it into the final Panda update, but the patent goes further.

If the site was determined to be low quality, Google would not merely penalize it; the site would continue to rank, but a warning would appear should a user click on it. This warning would inform the user that Google had determined the selected site to be low quality or a suspected content farm. Google would then offer to redirect the user to more relevant search results, or allow them to bypass the warning and continue to the site.
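The flow the patent describes can be sketched in a few lines of pseudocode-style Python. To be clear, this is a hypothetical illustration, not Google's actual implementation: the scoring signals, threshold, and function names (`quality_score`, `handle_click`) are all assumptions made for the example.

```python
# Hypothetical sketch of the patent's warning interstitial, as described above.
# All signals, thresholds, and names here are illustrative assumptions,
# not Google's real ranking code.

LOW_QUALITY_THRESHOLD = 0.3

def quality_score(page: dict) -> float:
    """Toy scorer: penalize hallmarks of content farms and parked domains."""
    score = 1.0
    if page.get("is_parked_domain"):
        score -= 0.8
    if page.get("duplicate_content_ratio", 0.0) > 0.5:
        score -= 0.5
    return max(score, 0.0)

def handle_click(page: dict) -> str:
    """The page still ranks; the interstitial only appears on click."""
    if quality_score(page) < LOW_QUALITY_THRESHOLD:
        return ("warning: this site may be low quality; "
                "redirect to other results, or continue anyway")
    return "navigate directly to site"

print(handle_click({"is_parked_domain": True}))
print(handle_click({"duplicate_content_ratio": 0.1}))
```

The key design point from the patent is visible here: classification doesn't change the ranking itself, only what happens between the click and the landing page.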

While allowing a less-relevant site to rank may seem counterintuitive, the resulting warning would potentially be more damaging to web developers who use these tactics. Google already uses a similar warning for sites it suspects are infected with malware, so it’s likely that users would accept one of the redirects the search giant offered them.

This would deny the website traffic while highlighting sites that ran afoul of Google’s guidelines. Google spam guru Matt Cutts said that his team’s goal wasn’t merely to discourage those who engaged in “black hat” SEO tactics, but to break their spirits, and this form of public shaming could certainly go a long way toward accomplishing that.

When Panda launched, it didn’t include this warning system. Perhaps Google felt the potential damage done to legitimate sites through false positives outweighed the benefits of punishing black-hat tactics. After all, a penalty is far easier to recover from than a public shaming. But hints of this technology remain.

In a recent blog post, Google outlined a plan to warn users, particularly mobile users, when a selected site is not optimized for their device. With Panda itself, however, Google chose to have the algorithm penalize low-quality sites rather than publicly shame them.

While this warning didn’t make it into Panda, it does give us a glimpse of where Google is headed. Google’s reputation and revenue depend on users trusting it to provide relevant, high-quality search results. With the algorithm improving every release, it may become accurate enough that Google chooses to use this patent, or a similar one, to publicly shame developers who don’t adhere to its guidelines.

This is why developing relevant, high-quality content should be the goal of every web developer. Google constantly updates its algorithm in the hunt for relevant results for its users, so sites built to please the user, rather than to exploit Google’s algorithm, are the ones that will survive Google’s next “Panda,” no matter how aggressive it turns out to be.

Source: Moz
