Q and A: How do I avoid the supplemental index if I have duplicate content?

Question

Hi Kalena

If I have two blogs where I effectively have duplicate content, how could I get around that?

The duplicate content could be because the two blogs are for different audiences (read: lists) or because we sometimes syndicate other articles. I thought of always placing a permalink to the original article, or should I play with the robots.txt file to make sure one of these pages does not get indexed? What would be the best way around this issue? I do not want to end up in the supplemental index.

Thanks

Jen

Hi Jen

I’m not convinced you need to have multiple blogs with the same content. That said, these days you don’t need to worry too much about duplicate content. Google does a pretty good job of filtering out pages it thinks are duplicates.

However, you DO want to control which version of the blog post or article Google considers to be the original, or else Google may just decide for you. There are a couple of ways of ensuring the pages you want are indexed and similar pages are ignored or filtered out as duplicates:

1) Include only one version of the post/article in your Google Sitemap.

2) Make sure internal and external links only point to the original.

3) Use a noindex robots meta tag on the duplicate pages (example below).

4) Block Googlebot from accessing your duplicate pages/folders via your robots.txt file (example below).
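For options 3 and 4, here’s roughly what that looks like in practice. The /blog-two/ folder name is just a placeholder for wherever your duplicate posts live:

    <!-- Option 3: in the <head> of each duplicate page -->
    <meta name="robots" content="noindex, follow">

    # Option 4: in the robots.txt file at the root of your domain
    User-agent: Googlebot
    Disallow: /blog-two/

One caveat: use one method or the other for a given page. If robots.txt blocks Googlebot from fetching a page, it will never see the noindex tag on that page.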

You might also want to check the advice given in our other posts about duplicate content.

——————-

Like to learn more about SEO and duplicate content? Download our free SEO lesson. No catch!


Q and A: Will Multiple Description Tags affect my Rankings?

Question

Hi Kalena,

I just noticed that my company’s homepage has five meta description tags within the head tag. Will this have any negative ramifications? Thank you,

Heather

Hi Heather,

The Meta Description Tag in itself is not likely to have a significant effect on your rankings one way or another, but it is still important because, more often than not, the snippet displayed in Google search results is taken from the description tag.

Using a description tag therefore gives you some control over the “message” you provide to searchers about what your page contains.

Having multiple description tags on the same page will not provide any SEO benefit – only the first one will be considered and the rest will probably be ignored. However, there is a chance that search engines could consider multiple tags “spammy”.

There is NO good reason to have multiple description tags on your site. At best, it is evidence of lazy coding, which increases the size of your page and slows down page load times; at worst, it could be considered spamming and may result in search penalties.
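For reference, a page only needs a single description tag in its head section. A quick sketch, with made-up content:

    <head>
      <title>Blue Widgets | Example Company</title>
      <meta name="description" content="Browse Example Company's range of blue widgets, with prices, specs and free shipping on orders over $50.">
    </head>

Any description tags after the first should simply be deleted.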

What about Keyword and Robots?

Using multiple Keyword and Robots Meta Tags is also probably not a good idea. Google will aggregate the content of multiple Robots tags (but doesn’t advise using more than one). It is not clear how multiple keyword tags are treated, but these days their use is mostly irrelevant anyway.
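To illustrate the Robots tag point: if a page somehow ended up with both of these tags…

    <meta name="robots" content="noindex">
    <meta name="robots" content="nofollow">

…Google would aggregate them and treat the page as noindex, nofollow. The cleaner approach is a single tag:

    <meta name="robots" content="noindex, nofollow">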

Duplicate Descriptions?

While we are talking about Description Tags… you should also try not to have “duplicate” description tags – i.e. multiple pages with the same description tag.

The fact that Google Webmaster Tools goes to the trouble of flagging duplicate descriptions as a warning should be an indication that Google doesn’t think this is a good idea either. Description tags should be unique and provide a succinct (and keyword-rich) description of the content of the page.

Andy Henderson
Ireckon Web Marketing


Q and A: How do I avoid duplicate content created by my CMS for product pages on my site?

Question

Dear Kalena…

You’ve helped us out with a couple of problems over the years ~ thanks again. Don’t have a problem this time but I do want to get your opinion/guidance so I can maybe AVOID a problem.

We handle over 5,000 products, and we want to create a page for each product using an automated page generator. Same as what thousands of other people do. Nothing fancy and no SEO tricks. Just a brief description of the item, price & how to order.

I’ll be using a template, of course, and about 75% of the words (excluding shared borders) will be common to all pages. The other 25% of words on a given page will be unique to the product/page in question.

I may be overly cautious, but I’ve learned the hard way that what seems like a good idea or what the rest of the herd is doing might not be acceptable to the SE’s, especially if not executed properly. We have a fairly well-performing website and the stakes get higher as we grow. So, any tips on what to do / not do when creating these individual product pages would be appreciated.

Thanks
Rick

Dear Rick,

Sometimes it’s possible to reduce duplicate content by placing that content in a dedicated section of your website and then linking to it where necessary (this can apply to things like shipping/handling, product guarantees, returns policies and terms & conditions, which some store owners will try to display on every page but which could quite easily be put elsewhere).

Another way to make the search engines focus on the unique content is to use emphasis tags (such as H1, H2, bold, italics etc.) on your page-specific content, and to use them sparingly (or not at all) in your page header, footer and other duplicated parts of the page. This will help the spiders isolate your unique, page-specific content, as well as drawing your readers’ attention to the most important parts of the page.
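As a rough sketch of that idea (the product details here are invented), a template might reserve heading and emphasis tags for the unique content only:

    <!-- Shared template areas: plain markup, no heading/emphasis tags -->
    <div id="header">Home | Categories | Ordering | Contact</div>

    <!-- Unique, page-specific content: this is where H1/H2/strong belong -->
    <h1>Deluxe Garden Trowel</h1>
    <h2>$12.95</h2>
    <p>Hand-forged steel blade with an ash handle. <strong>Ships within 24 hours.</strong></p>

    <!-- Shared footer: plain text links only -->
    <div id="footer">Shipping | Returns | Terms &amp; Conditions</div>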

You could also try setting up a feature that allows users to add reviews or feedback on each of the products. This user-generated content becomes yet another source of unique content for each page (and better still, you don’t have to write it yourself).

Hope this helps!

Peter Newsome
SiteMost SEO Brisbane


Q and A: Why should you prevent robots from indexing PPC landing pages?

Question

Hi Kalena

Quick question.

One of the lessons in your SEO201 course says that if you run PPC campaigns and use landing pages for these campaigns that are similar in layout and content, you should prevent search engine robots from indexing them. Can you explain why?

My thinking is that the more files the search engines index, the more exposure you may get.

Thanks
Alex

—————————————————————————————————

Hi Alex

PPC landing pages can often look nearly identical, with the only difference between them being the target keywords used. Web pages that look too similar are sometimes filtered out of the search results as duplicate content, and too much duplicate content on a domain may impact its ability to rank highly. That’s why I always recommend preventing robots from indexing landing pages, using your robots.txt file.
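The easiest way to do this is to keep all your PPC landing pages in a single folder and block that folder. For example, if they lived in a folder called /landing/ (a made-up name), your robots.txt would include:

    User-agent: *
    Disallow: /landing/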

If you are using existing pages on your site as PPC landing pages and they aren’t too similar to each other, there is no need to block robots from indexing them. Make sense?


Q and A: Does Google automatically search sub-directories?

Question

Dear Kalena…

Does Google automatically search sub-directories? Or do I have to have a ‘Links’ page to force Google to index the sub-directories?

Also, I was reading about ‘redundant’ content. I have a business directory which will eventually have thousands of pages with the only main difference in content being: {Company} {City} {ST} and {Subject1}. Will Google view this as redundant content?

Best Regards,

Steve

Dear Steve,

For Google to index your sub-directories, you will need some links pointing to them. These links can simply be internal navigation links, and if you have a large website, it’s also advisable to include a sitemap that links to all the pages and sub-directories within your site.
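If you go down the XML sitemap route, pages in sub-directories are simply listed by their full URLs. A minimal sketch, using made-up addresses:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url><loc>http://www.example.com/directory/acme-plumbing.html</loc></url>
      <url><loc>http://www.example.com/directory/smith-electrical.html</loc></url>
    </urlset>

You can then submit this file to Google via Webmaster Tools.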

In regards to your redundant content query, it’s best SEO practice to have at least 250 words of unique content per page. So if all the pages are the same apart from the contact details, then yes, it would be considered redundant content.

My advice would be to offer a one-page listing for each company and on that page have a small blurb about the company, their contact details and a feature that allows users to add feedback/comments/reviews. This should provide enough information for Google to index without causing redundant or duplicate content issues.

Hope this helps!

Peter Newsome
SiteMost
