A Panda, a Penguin, a Hummingbird and a Payday Loan walk into a bar. The bartender says…
Okay, so making jokes that involve the names of Google Algorithm updates is easier said than done. Which is just as well, because for thousands of websites over the last two years the effects of these updates have been no laughing matter.
The zoo-related algorithms in particular have caused a rankings storm. Both exist to enhance Google’s overall service, using separate strategies to punish low quality or irrelevant web pages.
Panda 4.0 was a serious revamp of the Panda algorithm. The biggest losers in the fall-out included heavyweights like eBay and Ask.com – the former reputedly lost 80% of its organic rankings. Many aggregator sites (think comparison or celebrity sites) also took a major hit. Conversely, a plethora of small and medium-sized sites received a notable boost in the rankings.
So what did these big companies do to find themselves on the receiving end? How could some of the most popular websites in the world get something notably wrong?
Ultimately, they failed to supply enough quality-driven content and suffered from factors like heavy advertising, regurgitated content and redundant pages. Sites rewarded by the update (including Zimbio and Medterms.com) were perceived to have done the opposite, offering regularly updated, user-driven material.
This Time It’s Personal
Released last month, the Panda 4.1 roll-out continues the war on so-called ‘thin content’. The consequences are two-fold: websites punished by the previous update have a chance at redemption (provided they took measures to address what went wrong); meanwhile, sites that escaped penalties last time but failed to react may suffer a fall in traffic this time around.
Staying ahead of the game and following the rules shouldn’t be that hard: Google itself published an article on what makes a quality site. Pointers to look out for included things like:
- Would you share the article or bookmark it?
- Does the content provide a comprehensive explanation of the subject in question?
- Would you trust the information presented?
Running parallel to all this is the Penguin algorithm, which focuses on the quality of SEO link building. Penguin 2.1 weeds out low performers by monitoring things like links from malicious sites and spam comments. This algorithm wreaked havoc last year, with huge numbers of sites experiencing a negative impact. Indeed, many critics have accused Google of creating a Frankenstein algorithm beyond its control!
The question is: will the next Penguin update (release date TBC – recent web chatter suggests by the end of 2014, though this is far from certain) reverse that damage and continue the good work of Panda? Or will it cause more problems than it solves?
Both updates have been accused of being overly draconian, but the opportunity for smaller, quality-driven websites to achieve top rankings can’t be ignored. This is particularly relevant for small businesses that rely on decent search rankings to boost their profile, and it certainly helps this year’s new wave of content managers, as well as keeping the big corporates on their toes.
Which leads to a dilemma: if future updates result in EVERY site being good quality, where do we draw the line? In a utopia built on top-quality content, how do we distinguish between the very good and the very best? How this pans out remains to be seen, but for the time being the onus on us as marketers is to keep pushing for clients and customers – to always be valuable.