In February 2011, Google launched an algorithm change designed to identify low-quality web pages and sites. This change is known as Panda, or Farmer. Panda 2.0 detects scraper sites, and Google banishes them from its results completely.
In the early days after Panda's launch, Google said, "We didn't use data about what sites searchers were blocking as a signal in the algorithm, but we did use the data as validation that the algorithm change was on target." Initially, Panda's targets were larger sites, which were affected more because of their large number of web pages, higher traffic, and stronger signals. With the latest update, smaller sites are being affected too. When this change rolled out in the United States, publishers whose sites were affected were unhappy about it. Google, on the other hand, was quite confident in the change: it said that if a site has been impacted and the publisher believes the site is of high quality, he should examine every aspect of the site thoroughly.
With the launch of Panda 2.0, Google has indirectly sent a message to all publishers to get their acts together and focus on what they do best. Google constantly makes minute changes to the algorithm that may go unnoticed by most people because they are so small. From time to time, Google also makes major changes, completely upgrading the algorithm. Another important thing a publisher should keep in mind is that after a site has been hit by Panda, the changes made to the site won't be reflected immediately. If you start making improvements the day after your site was hit by Panda 1.0, none of those changes will be registered to get you back into Google's good graces until the next time Panda runs and sites are reassessed.