Hi Kalena
If I have two blogs where I effectively have duplicate content, how could I get around that?
The duplicate content arises because the two blogs are aimed at different audiences (reader lists), or because we sometimes syndicate other articles. I thought of always placing a permalink to the original article, or should I play with the robots.txt file to make sure one of these pages does not get indexed? What would be the best way around this issue? I do not want to end up in the supplemental index.
Thanks
Jen
Hi Jen
I’m not convinced you need to have multiple blogs with the same content. That said, these days you don’t need to worry too much about duplicate content: Google does a pretty good job of filtering out pages it thinks are duplicates.
However, you DO want to control which version of the blog post or article Google considers to be the original, or else Google may just decide for you. There are a few ways of ensuring the pages you want are indexed and similar pages are ignored or filtered out as duplicates:
1) Include only one version of the post / article in the XML sitemap you submit to Google (see the example below).
2) Make sure internal and external links only point to the original.
3) Use a noindex robots meta tag on the duplicate pages (example below).
4) Block Googlebot from accessing your duplicate pages/folders via your robots.txt file (example below).
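
To give you a rough idea of point 1, a single entry in your XML sitemap might look something like this (the URL is just a placeholder, use your real permalink):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- list only the version you want Google to treat as the original -->
  <url>
    <loc>http://www.yourblog.com/original-post/</loc>
  </url>
</urlset>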
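
For point 3, you’d add a robots meta tag to the <head> section of each duplicate page, along these lines:

<!-- tells search engines not to index this page, but still follow its links -->
<meta name="robots" content="noindex, follow">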
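
And for point 4, a robots.txt rule keeping Googlebot out of a folder of syndicated copies can be as simple as this (/syndicated/ is just an example name, use whatever folder actually holds your duplicates):

# keep Googlebot out of the folder holding the duplicate copies
User-agent: Googlebot
Disallow: /syndicated/
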
You might also want to check the advice given in our other posts about duplicate content.
——————-
Like to learn more about SEO and duplicate content? Download our free SEO lesson. No catch!