Google SEO Penalties: the Panda Algorithm

What Is Google’s Panda SEO Penalty Algorithm?

Google’s Panda algorithm is one of many ranking algorithms Google has released in recent years. Google filed a patent for the Panda algorithm in September 2012, and the patent was granted in March 2014. The algorithm is designed to prevent lower-quality sites from ranking well in the SERPs. It does so using a threshold that is partly calculated by a complex formula, and partly set by humans employed by Google as quality raters.

Google Panda looks at the number of inbound links a site has, the reference queries it receives, and how often the site’s brand is searched for. It uses those signals to calculate something known as the ‘sitewide modification factor.’ That factor is then used to adjust the ranking of individual pages: if a page does not meet a certain quality threshold, the modification factor is applied to it, and the adjusted score helps determine where the page appears in the SERPs, based on the quality of the page and of the site it is on. Low-quality pages will appear lower in the SERPs regardless of the site they’re on, and low-quality sites will rank poorly at all times.
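
To make the idea concrete, here is a minimal sketch in Python of how such a two-part scheme could work. Google has never published the actual formula, so the signal names, weights, and thresholds below are entirely hypothetical assumptions for illustration, not the real Panda calculation.

```python
# Illustrative sketch only: the real Panda formula is not public.
# Signal names, weights, and thresholds are hypothetical assumptions.

def sitewide_modification_factor(inbound_links: int,
                                 reference_queries: int,
                                 brand_searches: int) -> float:
    """Blend hypothetical site-level signals into a single factor between 0 and 1."""
    link_score = min(inbound_links / 1000, 1.0)
    query_score = min(reference_queries / 500, 1.0)
    brand_score = min(brand_searches / 2000, 1.0)
    return 0.4 * link_score + 0.3 * query_score + 0.3 * brand_score


def adjusted_page_score(page_score: float,
                        page_quality: float,
                        site_factor: float,
                        quality_threshold: float = 0.5) -> float:
    """Apply the site-wide factor only to pages that fall below a quality threshold."""
    if page_quality < quality_threshold:
        # Low-quality pages inherit the site-wide demotion (or boost).
        return page_score * site_factor
    return page_score


# Example: a thin page on a weak site ends up with a much lower score.
factor = sitewide_modification_factor(inbound_links=120,
                                      reference_queries=40,
                                      brand_searches=300)
print(adjusted_page_score(page_score=10.0, page_quality=0.3, site_factor=factor))
```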

Google Panda affects entire sites, or sections of sites, rather than just individual pages. When the algorithm was first released, new updates were pushed to it on an almost monthly basis, but by March 2013 this was scaled back, and future updates were integrated into the core algorithm in a less ‘disruptive’ way. In July 2015, Google issued its first ‘slow rollout’ for Panda 4.2.

Google Panda has proven problematic for many webmasters, since the ‘quality’ requirements are quite opaque. Some webmasters who have not updated their sites for some time, or who were given bad advice by SEO ‘experts’, may have fallen prey to penalties as a result.

The good news is that any penalties given out by Google Panda are algorithmic penalties, so they can be recovered from. If a website deletes duplicate content, improves its navigation, disavows bad links, and generally ‘cleans up its act’, then problems with Panda, Penguin and Hummingbird can be resolved. It’s only the dreaded ‘manual spam action’ penalty that is very hard to come back from; the others may take time and patience, but they are certainly not the kiss of death that people think they might be.
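
As one concrete piece of that clean-up work, bad links can be disavowed by uploading a plain-text file to Google’s disavow tool, which accepts one URL or `domain:` entry per line and treats lines starting with `#` as comments. The sketch below simply generates such a file; the domains and URLs in it are placeholders for illustration, not real examples of bad links.

```python
# Minimal sketch: build a disavow file in the plain-text format Google's
# disavow tool accepts (one URL or "domain:" entry per line, "#" for comments).
# The domains and URLs below are placeholders, not real spam sources.

bad_domains = ["spammy-directory.example", "paid-links.example"]
bad_urls = ["http://blog.example/low-quality-guest-post.html"]

lines = ["# Disavow file generated for example.com"]
lines += [f"domain:{d}" for d in bad_domains]   # disavow every link from these domains
lines += bad_urls                               # disavow individual URLs

with open("disavow.txt", "w") as f:
    f.write("\n".join(lines) + "\n")

print("\n".join(lines))
```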

A lot of webmasters don’t even know if they’ve been hit by Panda, but you can look at your website’s history in Google Webmaster Tools and get an idea of where you stand. It’s easy enough to look at traffic charts and compare any sudden changes in traffic with the dates that Google rolled out updates. If you see a change around the time of an update, you could have been one of the sites affected by it. Changes shortly before an update can mean the same thing, as Google sometimes pushes updates to a small number of queries as a ‘test’ before unleashing the changes on the wider web. This creates an early ‘Google Dance’ in which a few websites are affected first, and then others follow later.
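
If you export daily traffic figures (for example from Webmaster Tools or your analytics package), a short script can flag suspicious drops around a known update date. This is a rough sketch under assumptions: the CSV layout (a `date` column and a `visits` column), the file name, the 14-day window and the 20% drop threshold are all made up for illustration, and the rollout date is only approximate.

```python
# Illustrative sketch: flag traffic drops around a known algorithm update date.
# The CSV layout ("date", "visits"), file name, window and 20% threshold are assumptions.
import csv
from datetime import date, timedelta

PANDA_42_ROLLOUT = date(2015, 7, 17)  # approximate start of the Panda 4.2 slow rollout
WINDOW_DAYS = 14
DROP_THRESHOLD = 0.20  # flag if average daily traffic falls by more than 20%

def average_visits(rows, start, end):
    """Mean daily visits between start and end (inclusive)."""
    visits = [int(r["visits"]) for r in rows
              if start <= date.fromisoformat(r["date"]) <= end]
    return sum(visits) / len(visits) if visits else 0.0

with open("daily_traffic.csv", newline="") as f:   # hypothetical traffic export
    rows = list(csv.DictReader(f))

before = average_visits(rows,
                        PANDA_42_ROLLOUT - timedelta(days=WINDOW_DAYS),
                        PANDA_42_ROLLOUT - timedelta(days=1))
after = average_visits(rows,
                       PANDA_42_ROLLOUT,
                       PANDA_42_ROLLOUT + timedelta(days=WINDOW_DAYS - 1))

if before and (before - after) / before > DROP_THRESHOLD:
    print(f"Possible Panda impact: traffic fell from {before:.0f} to {after:.0f} per day.")
else:
    print("No obvious drop around the update date.")
```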

Google is constantly fighting against screen scrapers, content spinners, article marketers and auto-bloggers, and Google Panda is one of its front lines of defense against those problems. It’s not perfect, but it has achieved a lot, and it’s likely to keep serving Google well as it is updated, making the web a better and more informative (as well as entertaining!) place for us all.
