Have you realized that your business, which previously appeared on the front page of a Google search on your area of expertise, has suddenly dropped off the face of the computer screen?
If so, you might have fallen victim to a recent slew of algorithmic changes Google has been carrying out to enhance Web search results for its users. Aimed at helping users avoid unwanted search results, spam and low quality content, the tweaks are a boon to many users but could be troublesome for businesses of all sizes that rely on the Internet for exposure.
Hot on the heels of last month's update to combat duplicate content, Google has unleashed Panda – an algorithmic change aimed at purveyors of low quality Web content. In February, Google also released a Chrome browser extension called Personal Blocklist, which lets users stop specific sites from appearing in their search results – and helps Google out at the same time.
“Generally perceived to be designed to tackle content farms, Panda destroys the rankings of sites which many Google users are sick and tired of seeing in the search engine results pages,” according to Adam Bunn, director of search engine optimization (SEO) at Greenlight, a U.K.-based search marketing and technology firm.
Like other recent tweaks to the search engine, Panda is now live in the United States and could quickly be implemented across the globe, according to Bunn.
To avoid being slammed with little or no warning, Greenlight is urging businesses to take the necessary steps now to ensure their sites' rankings, and thus their visibility, are not affected when Panda strikes.
Meanwhile in Toronto, Dev Basu, a search engine optimization specialist and president of Powered by Search, said Panda and the recent algorithmic updates illustrate that Google is ratcheting up its efforts "to enhance user experience and filter out low quality Web content."
Businesses that are serving their clients well do not have much to worry about from Personal Blocklist, according to Basu. The feature allows a user to blacklist a Web site, "but it does not have a global effect; if your other customers like you then they will not block your site."
But Personal Blocklist and Panda appear to be corroborating each other’s results, he says. “Google was happy to find out that many of the sites which users block using Personal Blocklist were also the sites that Panda filtered.”
He said Google’s efforts to raise the search bar have even affected sites that actually contain useful and unique content. “The top ranking site Cult of Mac, which provides reviews, news, and how-to instructions concerning Apple products, recently lost its search rankings when Panda was launched.”
Basu said Google has since rectified the problem and Cult of Mac's rankings have been returning to normal.
How Panda judges content
Panda is likely a combination of more emphasis on user click data and a revised document level classifier, according to Greenlight.
User click data concerns the behaviour of real users, during and immediately after their engagement with the SERPs (search engine results pages). Google can track click through rates (CTRs) on natural search results easily. It can also track the length of time a user spends on a site, either by picking up users who immediately hit the back button and go back to the SERPs, or by collating data from the Google Toolbar or any third party toolbar that contains a PageRank meter.
This collectively provides enough data to draw conclusions about user behaviour, said Bunn of Greenlight.
He said Google might conclude that pages are more likely to contain low value content if a significant proportion of users display any of the following behaviours:
- Rarely clicking on the suspect page, despite the page ranking in a position that would ordinarily generate a significant number of clicks
- Clicking on the suspect page, then returning to the SERPs and clicking a different result instead
- Clicking on the suspect page, then returning to the SERPs and revising their query (using a similar but different search term)
- Clicking on the suspect page, then immediately or quickly leaving the site entirely
Bunn said Google probably compares the engagement time against other pages of similar type, length and topic. For example, Google may consider a site’s “bounce rate” or whether users are “quickly” leaving a site after viewing just one page.
Bunn said Google could assign each page a percentage likelihood of containing low value content, and any page that exceeds a certain threshold might then be analysed in terms of its user click data. This keeps such data as confirmation of low quality only, rather than a signal of quality (high or low) in its own right, so it cannot be abused by webmasters eager to unleash smart automatic link-clicking bots on the Google SERPs.
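Bunn's described two-stage check – a content classifier first, with click data used only to confirm its suspicions – can be sketched roughly as follows. This is an illustration only: the function names, field names and thresholds are hypothetical, not anything Google has published.

```python
# Hypothetical sketch of the two-stage check Bunn describes:
# a document-level classifier flags likely low-value pages first,
# and user click data is consulted only to CONFIRM that suspicion,
# never as a standalone quality signal (so click bots can't game it).

LOW_VALUE_THRESHOLD = 0.6   # illustrative cutoff, not a real Google value

def looks_low_value(page):
    """Stage 1: a document-level classifier gives a likelihood score."""
    return page["classifier_score"] > LOW_VALUE_THRESHOLD

def click_data_confirms(page):
    """Stage 2: confirm with the behavioural signals listed above."""
    signals = [
        # Rarely clicked despite a position that should attract clicks
        page["ctr"] < page["expected_ctr_for_position"],
        # Users bounce back to the SERPs and click a different result
        page["pogo_stick_rate"] > 0.5,
        # Users return to the SERPs and revise their query
        page["query_revision_rate"] > 0.5,
        # Users leave the site entirely almost immediately
        page["quick_exit_rate"] > 0.5,
    ]
    return any(signals)

def demote(page):
    # Click data acts as confirmation only: a page the classifier
    # considers fine is never demoted on click data alone.
    return looks_low_value(page) and click_data_confirms(page)
```

Note the ordering in `demote`: because the classifier must fire first, flooding a competitor's listing with bot clicks accomplishes nothing on its own – which is exactly the abuse-resistance Bunn points to.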
What can you do to survive the Panda attack?
Basu of Powered by Search warned that low quality content can affect an entire domain. “Even if the majority of content on a business’s Web site is high quality and unique, if a couple of pages are seen as poor quality by Panda the site could still be filtered out,” he said.
To avoid this, Basu suggests that businesses identify and update these “weak” pages. Using a Web traffic analytics tool, businesses can determine which pages of their site received significantly lower traffic around the time Panda first launched in February. You can also check out traffic around the time Google updated Panda in March to see if it changed for those pages.
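As a rough illustration of Basu's suggestion, a before-and-after comparison over exported traffic data might look like the sketch below. The data shape, threshold and helper name are assumptions for illustration; real analytics tools expose this information in their own formats.

```python
from datetime import date

# Hypothetical daily pageview export, shaped as {url: {date: pageviews}}.
# Compare average traffic before and after Panda's U.S. rollout in
# late February 2011 to flag pages whose traffic dropped sharply.

PANDA_LAUNCH = date(2011, 2, 24)  # approximate U.S. rollout date

def find_weak_pages(traffic, drop_threshold=0.5):
    """Return URLs whose average daily traffic fell by more than
    drop_threshold (e.g. 0.5 = 50%) after the Panda launch date."""
    weak = []
    for url, daily in traffic.items():
        before = [views for day, views in daily.items() if day < PANDA_LAUNCH]
        after = [views for day, views in daily.items() if day >= PANDA_LAUNCH]
        if not before or not after:
            continue  # not enough data on one side of the launch
        avg_before = sum(before) / len(before)
        avg_after = sum(after) / len(after)
        if avg_before > 0 and (avg_before - avg_after) / avg_before > drop_threshold:
            weak.append(url)
    return weak
```

The same comparison, re-run around the March update date Basu mentions, would show whether a suspect page recovered or dropped further.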
Once you’ve identified the weak pages, you can apply some of the changes found in our article: Google’s content farm crackdown to affect Canadian rankings soon.
For more information on how to survive a Panda attack, see the Google Panda update survival guide.
Generally, Basu said, the idea is to avoid posting content that was “scraped” from other sites, or merely lightly altered versions of such material.
For instance, e-commerce sites using re-posted product descriptions can add other material such as high resolution images of the product. An employee or hired writer can alter the text so that it addresses the needs of the business’s clients.
Employing user generated content such as customer reviews or comments is another way of creating unique content, said Basu.
Businesses should aim to attract as many clicks as possible by optimising the message being put across to users with the page title, meta description and URL, said Bunn of Greenlight.
“Once users land on the site, they should be kept happy through the provision of a rich experience, with as much supporting multimedia as possible, and clear options for where to go elsewhere on the site if the first landing page does not ‘do it’ for them in the first instance,” he said.
“Regardless of what Google is doing, these are all the basic requirements for almost any online business, which get at the heart of what Google algorithm updates, and indeed SEO (search engine optimization), are all about,” Bunn explained.
If you feel your site has been unfairly penalized, or you are not sure whether your site is a Panda target, let Google know by contacting the search engine’s Webmaster Central.
Nestor is a Senior Writer at ITBusiness.ca. Follow him on Twitter, read his blogs on ITBusiness.ca Blogs and join the ITBusiness.ca Facebook Page.