Ever since Google's Penguin 2.0 update, thousands of site owners have been struggling to figure out why their sites aren't ranking well. Combing through potentially tens of thousands of links to find the ones causing problems is difficult, and many of these owners have been reaching out to Google for assistance. Google's head of search spam, Matt Cutts, addressed the issue in a video.
Matt reports that while his team has given some guidance on how to get rid of 'bad links' and how to spot links that are low quality, he is largely telling site owners to do their own legwork. Publishers, he says, will have to do their own research to discover what is causing their sites to drop in the SERPs.
He states, "We're working on becoming more transparent and giving more examples with messages as we can." This essentially means that beyond the automated messages some site owners have received, and will continue to receive, Google doesn't have any other information to give. Matt goes on to say, "I wouldn't try to say 'Hey, give me more examples' in a reconsideration request."
Matt recommends that site owners go through their own link profiles using tools like Google's own Webmaster Tools and try to identify which links come from low-quality sites. It is also a good idea to work with other webmasters or SEO experts to gather more information about what could be causing a site to be de-listed or to drop in the SERPs.
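As a rough illustration of the kind of legwork involved, a link export (for example, a CSV of linking URLs downloaded from Webmaster Tools) could be scanned with a short script that groups links by domain and flags candidates for manual review. The file layout, the `link` column name, and the suspect-TLD heuristic below are all assumptions made for this sketch; Google publishes no such rules, and flagged domains would still need a human look.

```python
import csv
import io
from collections import Counter
from urllib.parse import urlparse

# Hypothetical sample of a link export; a real file would come from
# the "Links to Your Site" report in Webmaster Tools.
SAMPLE_CSV = """link
http://example-blog.com/post/1
http://cheap-pills.xyz/spam-page
http://example-blog.com/post/2
http://free-links.info/directory
"""

# Crude heuristic: TLDs that often host low-quality link networks.
# This list is an illustrative assumption, not a Google rule.
SUSPECT_TLDS = {"xyz", "info"}

def flag_suspect_domains(csv_text):
    """Count linking domains and flag those with suspect TLDs."""
    counts = Counter()
    for row in csv.DictReader(io.StringIO(csv_text)):
        domain = urlparse(row["link"]).netloc
        counts[domain] += 1
    flagged = sorted(d for d in counts
                     if d.rsplit(".", 1)[-1] in SUSPECT_TLDS)
    return counts, flagged

counts, flagged = flag_suspect_domains(SAMPLE_CSV)
print(flagged)  # domains worth a closer manual review
```

In practice a real audit would use richer signals (anchor text, link velocity, known network footprints), but even a crude pass like this narrows tens of thousands of links down to a short list of domains to inspect by hand.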
Of course, the best thing any site owner can do is stick to approved SEO techniques, but even then a competitor could build negative links back to a site. This is a difficult subject for many site owners, and after these latest statements from Cutts it seems Google isn't going to be doing much to help.