For the past few months, Google has faced numerous complaints about the poor quality of its results. Is Matt Cutts's anti-spam team completely out of its depth? Is the new Google Chrome extension that lets users block certain sites from the results, developed by the Google team, a solution? What should we make of the JC Penney case (link spam)?
This article attempts to sum up the situation.
The Google Chrome extension “Personal Blocklist Extension”
Google has just released an official Chrome extension, developed by its anti-spam team (led by Matt Cutts), that allows users to block certain sites from their results. If you are fed up with seeing low-quality sites, you can block them. Just be aware that Google will know every site you choose to block, in order to improve its algorithm.
You can block an entire site (by its domain name) or just a particular subdomain. The extension, presented as experimental, is available in English, French, German, Italian, Portuguese, Russian, Spanish and Turkish. By the admission of Matt Cutts himself, who promoted the extension on the Google blog, it is part of the fight against content farms.
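As a rough illustration, client-side blocking by domain or subdomain could look like the following sketch. The host-matching rule here (an exact match, or a suffix match so that blocking a domain also hides its subdomains) is an assumption for illustration, not the extension's actual logic:

```javascript
// Sketch of client-side blocklist filtering. "blocklist" holds
// user-chosen hosts; blocking "example.com" also hides
// "shop.example.com" under this assumed matching rule.
function isBlocked(url, blocklist) {
  const host = new URL(url).hostname;
  return blocklist.some(
    (blocked) => host === blocked || host.endsWith("." + blocked)
  );
}

// Keep only the result URLs whose host is not blocked.
function filterResults(urls, blocklist) {
  return urls.filter((url) => !isBlocked(url, blocklist));
}
```

Because this runs entirely in the browser, the blocked sites still appear in the results Google returns; they are only hidden after the fact, which is exactly the client-side limitation discussed below.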
Even if the first users to download the extension are unanimously positive, we may ask ourselves whether this extension isn't a magnificent admission of impotence on Google's part! Isn't it Google's job to ensure that spammers do not appear in the results? Could someone abuse the extension by artificially blocking certain sites, via ghost accounts, in order to penalize competitors' sites?
If this extension is massively adopted, Google could mine its data as a new signal for its ranking algorithm. We can also imagine that, later on, “block this site” links will appear next to every result in the standard SERP interface (without the extension). Matt Cutts has said that they are thinking about it, but from a technical point of view it takes longer to build (for now, with the “Personal Blocklist” extension, the blocking is done client side, not server side). In a way, Google could rely on massive collective reporting of spam, as Gmail does so well. But does that really transfer to webspam?
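To make the Gmail analogy concrete, here is a hypothetical sketch of how block reports might be aggregated into a spam signal. The idea of flagging a domain once enough distinct users have blocked it, and the threshold itself, are assumptions for illustration, not Google's actual method:

```javascript
// Hypothetical aggregation of user block reports into a spam signal.
// reports: array of { user, domain } block events.
// A domain is flagged once at least minUsers DISTINCT users block it,
// so one abusive account repeating a report counts only once.
function suspectDomains(reports, minUsers) {
  const usersPerDomain = new Map();
  for (const { user, domain } of reports) {
    if (!usersPerDomain.has(domain)) usersPerDomain.set(domain, new Set());
    usersPerDomain.get(domain).add(user);
  }
  return [...usersPerDomain]
    .filter(([, users]) => users.size >= minUsers)
    .map(([domain]) => domain);
}
```

Counting distinct users rather than raw reports is one simple defence against the ghost-account abuse mentioned above, though it would obviously not stop a determined attacker with many accounts.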
The JC Penney affair
The JC Penney site, a very large e-commerce site in the US, was caught red-handed spamming, as revealed in an article published in the New York Times. Because of this public revelation, Matt Cutts was forced to react by manually penalizing jcpenney.com. Google did not need this affair, already being criticized for its poor handling of the quality of its results. Back in January we saw the astonishing episode in which Google accused Bing of copying it… which created a diversion.
That leads us to a few questions:
Why did it take an article in the NYT for Google to handle this case and apply its quality guidelines (read: penalize the offending site)? Especially when Matt Cutts tells us that his team had already flagged this site as breaking the rules.
Does the fact that JC Penney invests millions in AdWords change anything? Or is it simply impossible to remove the site from the results because of its notoriety (users would not understand why a major market player could not be found on Google)?
Would it change anything for Google if no links had been bought?
How can we be sure that these links were set up by the “offending” company?
In fact, it could also be the malicious action of a competitor… Doesn't Google claim that we are only responsible for the links we place on our own site?
Will Google finally change its algorithm so that over-optimized inbound thematic links of this kind (exact-match anchor text) are given little or no weight?
We have been talking about Google, but Bing is just as concerned…