Well, according to Greg Grothaus of the Google Search Quality team, it is. We are used to getting tips from Matt Cutts's YouTube channel (a great channel to subscribe to, by the way, if you don't already), but this time someone from Search Quality has weighed in on the subject of duplicate content. For a long time the cry of "No Duplicate Content" has echoed throughout the SEO community. Those days are, for all practical purposes, over. In this ever-changing SEO landscape, this is just another myth that has been debunked. Mr. Grothaus posted a video about the duplicate content myth here, along with a presentation on the Google Webmaster Central blog covering the multi-site and duplicate-content issues that tacticians continue to face when trying to improve web rankings, particularly in Google. Greg starts by clearing up a popular myth about duplicate content: the belief that websites get penalized simply for having it. That is not to say there are no other implications of having the same content across multiple sites or channels, but Google itself does not penalize you for it.
Mostly the presentation boils down to how you are using duplicate content. Perhaps the content in your feed (if you don't nofollow it) duplicates what is on your site's blog or pages; Google recognizes that this is not deceptive. What they are interested in is the same thing they have always been interested in: providing the best experience for users (searchers). That means they have to get smart about penalizing people who use duplicate content as spam, as opposed to those who use it as a legitimate way to spread content virally, perhaps via a widget that feeds your latest SEO content to another site that has an audience for that content but doesn't publish its own.
Greg likens this to the bold tag. Spammers use bold tags, among other things, but that doesn't mean Google is going to penalize all instances of bold tags across the board. What they are really doing is trying to diversify the results for their audience. One way you might be shooting yourself in the foot with duplicate content, without any Google intervention, is PageRank dilution. If you have links pointing to the same content under separate URLs, such as …/content1.com, …/?p=123, …/?index.html, etc., then you are hurting your PageRank, as it is dispersed across multiple URLs instead of being concentrated on one page. If this is the case on your site, you might want to resolve it. Greg goes on to say that Google might spend a lot of time crawling your duplicate content across multiple URLs and might not have time to follow deep links through your site, so you run the risk of content being skipped over and subsequently not getting indexed.

You also might want to look into the canonical URL. This lets you tell Google which page you want treated as the actual source of the content, the one to be indexed. You can find out more about canonical URLs here, or do a Google search and find out how you can apply this to your site to get the most out of your content and web rankings in Google. The use of canonical URLs to combat duplicate content is still fairly new, but it sounds like a great idea whose time has come. Now let's just get Yahoo and Microsoft on board and set some standards. Would these canonicals pass PageRank? All good discussions to have.
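For reference, the canonical hint is just a single link element placed in the page's head. A minimal sketch, with a hypothetical domain and path standing in for your own:

```html
<!-- On each duplicate URL (e.g. the ?p=123 variant of a post), tell Google
     which version you want indexed. "www.example.com/my-post/" below is a
     hypothetical placeholder for your preferred URL. -->
<head>
  <link rel="canonical" href="http://www.example.com/my-post/" />
</head>
```

With that tag in place on the duplicate variants, the indexing and link credit should consolidate onto the one URL you named, rather than being split across every version of the page.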
[tags]duplicate content, canonical urls, google, matt cutts[/tags]