Q and A: Will my Foreign Language site be considered Duplicate Content?

Question

Dear Kalena,

We have a website written in English that we like. However, it cannot be seen in China. In order to generate Chinese business, we will have to write a new website, and have it hosted by a Chinese hosting company.

The site will be written in Chinese characters. The layout of the site will be different, as will the pictures, picture descriptions and alt tags. It will also be done on a template, as is our first website. However, we really do like what the English website content says. We used Google Translate on the content of our site, and discovered it gave a very accurate translation of the English site. We would like to use this translation, with a few modifications, but really do not want to have a problem with duplicate content on Google. Our intent is just to do business in the Chinese market. Any advice you can give us will be most appreciated.

Best regards, Tony

Hi Tony,

Duplicate content is certainly an issue that website owners need to take into consideration when creating their sites. Whether content is sourced from third parties (often the case for product-based sites) or re-used from another of your own sites (which is effectively what you have done), care needs to be taken.

There are some specific circumstances where duplicate content will not be a problem – and you have touched upon two of them in your question.

Translated Content

Even though two separate pages may say exactly the same thing, and the content is effectively “the same”, a Chinese-language page and an English-language page will not be considered duplicate content by search engines – even if they are on the same domain and hosted on the same server.

As you are probably aware, automatic translation tools are notoriously unreliable, and although they can often give a translation which provides a reasonable understanding of the original content, I’ve rarely seen a perfect translation – some manual adjustment will almost certainly be necessary.  I suggest that you have the content reviewed and updated by a native Chinese speaker before you include it on your Chinese site.


Country Specific Domains / Hosting

It’s a surprisingly little-known fact that sites with different domains, hosted in different countries, are unlikely to incur duplicate content penalties – even though they may contain the same content. This was confirmed by both Google and Microsoft at SMX Sydney last year.

So even if your Chinese hosted site with a Chinese specific domain was in English, you would be unlikely to encounter any duplicate content issues.

So, in the circumstances you describe – a translated site, with a separate domain, hosted in a separate country – you will be quite safe and will not incur any duplicate content penalties.

Andy Henderson
Ireckon Web Marketing

Share this post with others

Q and A: Why aren’t our franchisee websites being found in search results?

Question

Hi Kalena,

I have just encountered something I am not sure about and I really need some advice. The site I am working on has the following issue:

It is a business with 100 franchises. The franchisees are complaining that they do not come up in any searches. I have checked, and they don’t – not even when you type their exact URL into the search engine.

The URL structure for the business’s franchises works like this:
www.clientsite.com/studio/location (actual URL provided)

A related problem may be that there are 3 separate XML sitemaps:
1) www.clientsite.com/sitemap/sitemap.xml
2) www.clientsite.com/sitemap/location(Alpha)sitemap.xml
3) www.clientsite.com/sitemap/location(postcodes)sitemap.xml

The first is their MAIN sitemap. The other two are sitemaps for all the locations of their franchises (100 in total). These locations and their URLs are not included in the MAIN sitemap. Is having multiple sitemaps detrimental to SEO?

Yen

Hi Yen,

You may be surprised, but this is a VERY common issue for franchise websites that are based on a template structure, and you’ll realise that the reason the franchisee pages are not being found in search results is actually pretty simple… But first, I’ll address your sitemap query.

Multiple Sitemaps

Using multiple sitemaps is not the problem here.  If you do a search for  site:clientsite.com in Google you will see that the pages in question are actually indexed – which means that the search engines have found and crawled them.

I think though that it is probably unnecessary for your site (with just a couple of thousand pages) to have multiple sitemaps.  Multiple sitemaps are recommended (and in fact required) for very large sites, but there is a specific protocol involving a sitemaps index file (that you do not seem to be using).  You can find out more about it, with clear instructions and examples on how to correctly use sitemaps at sitemaps.org.
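For reference, the sitemaps.org protocol for multiple sitemaps is just a small extra XML file that lists each child sitemap. A sketch, using the sitemap URLs from your question (the index filename itself is an assumption):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Sitemap index file, e.g. saved at www.clientsite.com/sitemap/sitemap_index.xml -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.clientsite.com/sitemap/sitemap.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.clientsite.com/sitemap/location(Alpha)sitemap.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.clientsite.com/sitemap/location(postcodes)sitemap.xml</loc>
  </sitemap>
</sitemapindex>
```

You would then submit just the index file to the search engines, rather than the three sitemaps individually.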

So the issue with your site is not indexing – it is ranking. You don’t specify which search queries you hope/expect the pages to be found for, but for all the examples I tried, the franchisees’ pages did come up for a query of their business name itself – which is more evidence that the pages are indexed OK. From what I could see, all your franchisees seem to have a single page of content based on a standard template, with just the business name and contact details changed. So in effect, each franchisee’s page is one of 100 essentially “identical” pages on the site.

Website Templates

This is a clear issue of duplicate content, which is very common for franchise sites based upon standard templates (ones that provide templated content rather than just the structure or design). In this instance, each franchisee has just a single page within the same root domain (1 of 100 almost identical pages), with relatively little keyword-rich content, so I am not surprised (and neither should you be) that it does not rank at all for general keyword phrases. In fact, even if each franchisee had an individual domain with multiple pages of optimised, keyword-rich content, they still would not rank any better if those pages were based on the same template.

I get asked about this type of issue a lot. Excited and enthusiastic new franchisees (and multi-level marketers) set up their websites using a template provided by “the business” and pretty soon begin to wonder why the eagerly anticipated enquiries and sales aren’t flooding in.

Quality, Keyword Rich, Unique Content

One of the very first things that most SEOs learn is that to get good rankings you need quality, keyword-rich and UNIQUE content. Using a templated approach is clearly NOT a strategy you should follow to get unique content. For a graphic example, try this search query: “incalculable numbers of real people” – which is a snippet of text taken from a website template for a well-known international “We are Not Multi Level Marketing” organisation (probably not the one you are thinking of).

That fairly specific and, you might expect, “unique” query returns over 40,000 results. Is it any wonder that most of these sites will never be found through organic search?

That’s not to say there is no value in these templated systems – many have been set up to very cleverly guide people through to the signup process – but if you “own” one of these sites you will need to use other methods (PPC, advertising, etc.) to get traffic to it, rather than relying on organic search.

So Yen,  back to your question… If your franchisees want to be found for generic keyword searches, I suggest that they register their own domains, and create their own unique, keyword rich content rather than depending on the corporate “template”.

Andy Henderson
WebConsulting


Q and A: Is it absolutely necessary to remove parameters from a URL?

Question

Hi Kalena

Is it absolutely necessary to remove numeric parameters from a URL such as www.site.com/keyword/category213.html to ensure the page is indexed and, if so, why?

Thank you
Lana

Hi Lana

The URL you provided doesn’t contain any parameters. It’s a flat HTML file so search engines shouldn’t have any problems indexing it.

It’s URLs with “query strings” that contain parameters or variables. For example:

www.site.com/product.asp?productid=2

The question mark indicates the page is dynamic and therefore requires some type of server computation to display. The page URL above contains only one parameter (productid).

See more about how Google defines dynamic vs static URLs.

These days, most search engines can index pages that contain a single parameter. It is generally when multiple parameters are used in page URLs that search engine indexing problems occur.

As Google says in their Design and Content Guidelines:

“If you decide to use dynamic pages (i.e., the URL contains a “?” character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few”.

Multiple parameters are often needed for large sites with multiple page templates and dynamically generated content for each section of each page. Multiple parameters are separated via an ampersand (&), for example:

www.site.com/product.asp?productid=2&producttype=large

The URL above instructs the template for the page product.asp to query the database and load the content for product id number 2, and specifically the data for the large version of that product, whenever anyone accesses the page.

This type of URL is more difficult for a search engine to index, because the engine can’t easily tell what the multiple parameters mean or whether the URL represents a unique page.

So in this case the webmaster has the option to rewrite the URLs at the server level to remove the parameters, or else block search robots from indexing URLs containing multiple parameters.
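As a sketch of the first option, a server-level rewrite can map a clean, parameter-free URL onto the dynamic one behind the scenes. This example uses Apache’s mod_rewrite, and the clean-URL scheme shown is just an assumption:

```apache
# .htaccess sketch (Apache mod_rewrite) -- the /product/ID/SIZE scheme is assumed
RewriteEngine On
# Serve /product/2/large internally via product.asp?productid=2&producttype=large
RewriteRule ^product/([0-9]+)/([a-z]+)$ /product.asp?productid=$1&producttype=$2 [L,QSA]
```

Visitors and search engines then see only www.site.com/product/2/large, while the server still runs the same dynamic script.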

But if you’re in doubt, I wouldn’t worry too much about your dynamic URLs. Google and the other search engines are pretty good at parsing and determining what parameters to ignore.

——————-

Like to learn more about SEO? Download my free SEO lesson. No catch!


Q and A: How do I avoid the supplemental index if I have duplicate content?

Question

Hi Kalena

If I have two blogs where I effectively have duplicate content, how could I get around that?

The duplicate content could be because the two blogs are for different audiences (read lists) or because sometimes we syndicate other articles. I thought of always placing a permalink to the original article – or should I play with the robots.txt file to make sure one of these pages does not get indexed? What would be the best way around this issue? I do not want to end up in the supplemental index.

Thanks

Jen

Hi Jen

I’m not convinced you need to have multiple blogs with the same content. That said though, these days you don’t need to worry too much about duplicate content. Google does a pretty good job of filtering out pages it thinks are duplicates.

However, you DO want to control which version of the blog post or article Google considers to be the original, or else Google may just decide for you. There are a couple of ways of ensuring the pages you want are indexed and similar pages are ignored or filtered out as duplicates:

1) Include only one version of the post / article in your Google site map.

2) Make sure internal and external links only point to the original.

3) Use a noindex robots tag on the duplicate pages.

4) Block Googlebot from accessing your duplicate pages/folders via your robots.txt file.
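To illustrate options 3 and 4 (the folder name below is just an invented example, not from your site):

```html
<!-- Option 3: add to the <head> of each duplicate page -->
<meta name="robots" content="noindex, follow">

<!-- Option 4 (alternative): block the duplicate folder in robots.txt instead, e.g.
     User-agent: Googlebot
     Disallow: /syndicated/
-->
```

Pick one approach per page – blocking a page in robots.txt while also relying on its noindex tag is redundant, since a blocked page is never crawled.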

You might also want to check the advice given in our other posts about duplicate content.

——————-

Like to learn more about SEO and duplicate content? Download our free SEO lesson. No catch!


Q and A: Will Multiple Description Tags affect my Rankings?

Question

Hi Kalena,

I just noticed that my company’s homepage has five meta description tags within the head tag. Will this have any negative ramifications? Thank you,

Heather

Hi Heather,

The Meta Description Tag in itself is not likely to have a significant effect on your rankings one way or the other, but it is still important because, more often than not, the snippet displayed in Google search results is taken from it.

Using a description tag therefore gives you some control over the “message” searchers see about your page.
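For reference, a single well-formed description tag looks like this (the page and wording are of course invented for illustration):

```html
<head>
  <title>Blue Widgets | Acme Widget Co</title>
  <!-- One description tag per page, unique to that page's content -->
  <meta name="description" content="Hand-made blue widgets shipped Australia-wide.
    Browse our full range of widget sizes and colours.">
</head>
```

Keep it to a sentence or two – search engines typically truncate long descriptions in the results snippet.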

Having multiple description tags on the same page will not provide any SEO benefit – only the first one will be considered and the rest will probably be ignored. There is also a chance that search engines could consider multiple tags “spammy”.

There is NO good reason to have multiple description tags on your site – at best it is proof of lazy coding, which increases the size of your page and slows down page load times – at worst it could be considered spamming and may result in search penalties.

What about Keyword and Robots?

Using multiple Keyword or Robots meta tags is also probably not a good idea. Google will aggregate the content of multiple Robots tags (though they advise against using more than one). It is not clear how multiple Keyword tags are treated – but these days their use is mostly irrelevant anyway.

Duplicate Descriptions?

While we are talking about Description Tags… You should also try not to have “duplicate” description tags – i.e. multiple pages with the same description tag.

The fact that Google webmaster tools goes to the trouble to flag duplicate descriptions as a “warning”, should provide an indication that Google doesn’t think this is a good idea either. Description tags should be unique, and provide a succinct (and keyword rich) description of the content of the page.

Andy Henderson
Ireckon Web Marketing
