
Understanding Penguin 1.1: Be Safe from Updates in 3 Easy Steps


So, first things first! What is the difference between PENGUIN and PENGUIN 1.1?

The Penguin update was a change to a module within the search engine algorithm, while Penguin 1.1 is merely a data refresh within that same module. Matt Cutts explained this distinction in a post way back in 2006.

By now, there will be many people out there talking about PENGUIN 1.1 and web spam and how to avoid it, etc., etc., etc. But do we truly know the core of PENGUIN 1.1 and how to tackle it? PENGUIN 1.1 was primarily associated with data refresh, plus a host of other, as usual, “SECRET SIGNALS.”

This data refresh is probably best known to SEO old-timers as the “Google Dance.” In a data refresh, the algorithm itself, that is, the way an input (search query) is turned into an output (relevant results), stays constant, while the data it processes is updated. In this case, the data is a particular website page and its surrounding value (links, PR, domain authority, relevancy of content, etc.), so the order of results can change even though the algorithm has not. When the processing itself changes rather than the data, it is termed an algorithmic update. I hope this makes things simple enough to understand.
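As a toy sketch (nothing to do with Google's actual systems), the distinction can be shown with a made-up scoring function: a data refresh re-runs the same function over updated page data, while an algorithmic update changes the function itself. Every name and number below is invented purely for illustration.

```python
# Toy model: the "algorithm" is a fixed scoring function; the "data" is
# per-page signal values. Refresh = new data, same function. Update = new
# function, same data. All figures are made up.

def rank(pages, score):
    """Return page names ordered best-first by the given scoring function."""
    return [name for name, _ in sorted(pages.items(), key=lambda kv: -score(kv[1]))]

# Page data: (link count, content relevance). Illustrative numbers only.
pages_before = {"a.html": (120, 0.4), "b.html": (30, 0.9)}
pages_after  = {"a.html": (10, 0.4),  "b.html": (30, 0.9)}  # spammy links discounted

score_v1 = lambda p: p[0] * p[1]              # original algorithm
score_v2 = lambda p: p[0] * 0.1 + p[1] * 50   # "update": signals reweighted

print(rank(pages_before, score_v1))  # baseline ordering
print(rank(pages_after, score_v1))   # data refresh: same algorithm, new data
print(rank(pages_before, score_v2))  # algorithmic update: same data, new algorithm
```

Notice that rankings can shuffle in both cases; what differs is whether the data or the scoring logic moved.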

Next is the topic of WEBSPAM!

Web spam, as I see it, is basically a Molotov cocktail with three prime elements:

  1. Excessive use of keywords within a particular page, article, or content piece, primarily to signal importance over others
  2. Over-excitement in optimization
  3. A linking system that clearly shows it has been purchased

Although there are many more factors that come into play, the most harm is caused by the interaction of these three primary elements. Once this is ignited, it will definitely alert Google’s systems about possible harm that could be caused. The effect is pretty plain and forthcoming: the immediate devaluation of pages, loss of PR, disintegration of rankings overnight, etc.

Remember, most or probably all of this is recorded within Google’s systems, and when the time comes (via an algorithmic update or a data refresh), Google already has bait to target. This data is then collectively used to eradicate rankings, and thus more PENGUINS and PANDAS are born, the latest being PANDA 3.7.

How do I prevent/rectify this?

  • SMARTER LINKS: At the very core of all SEO activities lies a link environment for each website. That link system creates a niche web within the WWW that any search engine crawler can associate with a particular website. Given advances in technology, search engine crawlers are now capable of understanding the semantics (the meaning of the text) associated with such links. Once this profile of links, semantic associations, and website content is pieced together, search engines are bound to differentiate between real links and paid links.

The best thing to do here is to settle on a sensible mix of paid links and genuine, workable real links that achieves optimum results without getting wrongly flagged.

We have seen many cases of blog links and really good directory links that looked very natural and untraceable as paid links. Over time, these niche blogs expand their horizons in all directions and end up spammy themselves. This is noticeable from the number of outgoing links to a particular page from one blog post or from a particular directory. Google Webmaster Tools is the easiest way to locate these now-defunct links.

The snapshots below show a single page getting nearly 4,000 links from a single domain.

[Screenshot: Google Webmaster Tools link report showing nearly 4,000 links to one page from a single domain]

[Screenshot: Detail view of the same link report]
So many incoming links from one domain are a direct indication of web spam, however unintentional the motive may have been. The best way to rectify this is to remove the source links as soon as possible. Then, at least, Google will know you are working on improvements, and it will start to reconsider your website.
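A quick way to spot such domains in bulk, assuming you have exported your backlink URLs (for example, from Webmaster Tools) into a plain list, is to count links per domain. The domain names and the threshold below are made up for illustration, not a real export format.

```python
# Sketch: flag domains sending an unusually high number of links.
# Assumes backlink_urls is a flat list of linking URLs (hypothetical data).
from collections import Counter
from urllib.parse import urlparse

def suspicious_domains(backlink_urls, threshold=100):
    """Count linking URLs per domain; return domains at or over the threshold."""
    counts = Counter(urlparse(u).netloc for u in backlink_urls)
    return {dom: n for dom, n in counts.items() if n >= threshold}

# Made-up example: one blog domain linking ~4,000 times to the same page.
urls = ["http://spamblog.example/post%d" % i for i in range(4000)]
urls += ["http://goodsite.example/review", "http://another.example/mention"]
print(suspicious_domains(urls))  # {'spamblog.example': 4000}
```

With a real CSV export you would read the URL column via `csv.reader` first; the counting logic stays the same.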

When creating links in the future, care has to be taken that thorough Quality Checks are done on each of these link avenues. The quality check should include:

  • Domain Value
  • Type of Link Avenue (whether relevant or irrelevant to business/product/service)
  • Pagination Issues on the Domain (VERY IMPORTANT TO CHECK; one link can easily get copied across multiple pages.)
  • IP Address Check (Do your link avenues sit on the same IP address or on different ones? With IPv6 coming into effect, this might become much easier to locate and deduce.)
  • Caching and Indexing of the Prospective Link Avenue (needs to be checked)
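The checklist above can be sketched as a simple record. The field names, the 0-10 domain-value scale, and the pass rule are my own illustrative assumptions, not an official scoring method.

```python
# Sketch of a link-avenue quality check; thresholds are invented.
from dataclasses import dataclass

@dataclass
class LinkAvenue:
    domain: str
    domain_value: int            # assumed 0-10 authority estimate
    relevant_to_business: bool   # relevant to business/product/service?
    duplicated_by_pagination: bool  # same link copied across many pages?
    shares_ip_with_others: bool  # same IP as your other link avenues?
    cached_and_indexed: bool     # is the page cached and indexed?

    def passes_quality_check(self) -> bool:
        return (self.domain_value >= 3
                and self.relevant_to_business
                and not self.duplicated_by_pagination
                and not self.shares_ip_with_others
                and self.cached_and_indexed)

candidate = LinkAvenue("nichedirectory.example", 5, True, False, False, True)
print(candidate.passes_quality_check())  # True
```

Running every prospective link avenue through a checklist like this, before the link is built, is far cheaper than removing bad links after a penalty.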

It is completely fine if you get one or two links from any one domain, and keeping to that greatly reduces the chance of future penalties.

  • NATURAL OPTIMIZATION: Google stresses this aspect in almost every answer it gives. Many SEOs have long believed that natural optimization means nothing more than not crossing the borderline into keyword stuffing in the title, META tags, content, and so on. One main thing that is usually forgotten is the match type of the keyword used, which causes unintentional harm in the long run.

Optimizing a website using an SEO-friendly, naturalistic approach is currently the need of the hour. As an example, take any page on a highly optimized website and count the number of exact keyword matches within the following:

  • Navigation Content
  • Footer Content
  • Alternate Text Within Images
  • Main Content
  • Internal Links with Exact Keyword Matches
  • Title Tag
  • META Tags

Count the number of times the same exact keyword is repeated, and then judge whether optimization is indeed truly natural. This optimization, coupled with a link strategy, will ultimately determine the quality factor of a particular website.
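A minimal sketch of that counting exercise, using an invented keyword and invented page sections:

```python
# Sketch: count exact-match occurrences of a keyword per page section to
# judge how natural the optimization looks. All page text here is made up.
import re

def exact_matches(keyword, sections):
    """Count whole-phrase, case-insensitive matches in each named section."""
    pattern = re.compile(r"\b%s\b" % re.escape(keyword), re.IGNORECASE)
    return {name: len(pattern.findall(text)) for name, text in sections.items()}

page = {
    "title": "Cheap Tickets | Cheap Tickets Online",
    "meta_description": "Buy cheap tickets today. Cheap tickets for every route.",
    "main_content": "Our cheap tickets deals cover most major airlines.",
    "footer": "cheap tickets | cheap tickets | cheap tickets",
}
counts = exact_matches("cheap tickets", page)
print(counts, "total:", sum(counts.values()))
```

Eight exact matches across four short sections, as in this made-up page, is the sort of repetition that reads as stuffed rather than natural.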

I had a website with two keywords that each brought in about 100 visits/month. After PENGUIN, traffic from these search queries dropped to negligible levels. We applied the measures above as a mix of natural optimization and removal of now-irrelevant links, and within two weeks of making these changes, the following results were observed for those two keywords:

[Screenshot: traffic results for the two keywords, two weeks after the changes]
  • VARIATIONS: Variations of keywords, and of the vocabulary used in articles and other textual pieces both on and off the page, have become even more important with PENGUIN 1.1. This matters to such an extent that even exact-match keyword anchor text is sometimes considered web spam. The best way to avoid this is to create variations.

How do I create variations?

There is a saying, “Fight fire with fire.” Much in the same manner, Google needs to be tackled using Google itself. For example, if the target keywords are “holiday packages,” “cheap tickets,” and “buy prom dresses,” use Google’s Contextual Targeting Tool within the AdWords interface to obtain many different variations that can be used as anchor text, article link holders, etc.


Once you have selected your required variations, check for Search Volumes and Competition, and get your very own variations straight from Google to use on Google.
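Once you have a list of variations, one simple way to stop any single anchor text from dominating your link profile is to rotate through them. The variation list below is invented for illustration, not pulled from any tool.

```python
# Sketch: rotate through keyword variations (e.g. gathered from a keyword
# tool) so no single exact-match anchor text dominates. Data is made up.
from itertools import cycle

variations = ["holiday packages", "vacation deals", "holiday travel offers",
              "package holidays", "all-inclusive getaways"]

anchor = cycle(variations)
planned_links = [next(anchor) for _ in range(8)]  # anchors for the next 8 links
print(planned_links)
```

In practice you might weight the rotation toward higher-volume variations, but even a plain round-robin breaks up the exact-match pattern.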

By keeping these conditions in mind at all times, there is a very high chance that you will see an improvement in rankings across different mediums and considerably reduce the probability of being penalized by any future update. After all this, interlinking variations with link building will only help if it is complemented by the natural feel of on-page optimization.

To conclude, remember that PENGUINs are very fond of phish, and anything phish-y about your website will attract a colony of PENGUINs. SEO works on the fundamentals of growing webs to attract spiders and users, while lowering phishy-ness so as to repulse PENGUINs.
