While most SEO experts have been saying that duplicate content is not going to be a problem with Google for certain news stories, exact quotes and a handful of other situations, it actually goes further than that. In a recent video, the head of Google's webspam team, Matt Cutts, explained that duplicate content can, and in some cases even should, be used. As always, he says that websites should be thinking about what would best serve the actual visitor to the site.
In cases where duplicate content serves visitors best, that is what should be used. When writing something unique would provide visitors with a better experience, then unique content is the right choice. He specifically brought up sites in the financial, technical and pharmaceutical fields, where there is a lot of information that really can't be written in unique ways. These sites serve their customers by using the exact same instructions or explanations that other sites have used in the past.
“Unless the content that you have is spammy or keyword stuffed or something like that,” Cutts says, “then there might be an algorithm or a person might take action on it, but if it’s a legal boilerplate that’s sort of required to be there, we might at most not want to count that…”
Cutts even went on to add that these types of things may actually help a site in the SERPs. If Google were to punish sites for having appropriate legal forms, disclaimers, warnings and the like, it would not be serving users' best interests.
Of course, this information does not mean people should start making 'auto-blogs' that snatch information from other sites and post it on their own. Google will not look fondly on a site made up almost exclusively of copied content. The point Matt Cutts was trying to make, it seems, is that site owners should not go well out of their way to make something unique when it doesn't need to be. Rewriting the terms of service on a page, for example, does not benefit anyone. Nor is it helpful to write a unique warning text about a particular medication.