Q and A: Is it absolutely necessary to remove parameters from a URL?

Hi Kalena

Is it absolutely necessary to remove numeric parameters from a URL such as www.site.com/keyword/category213.html to ensure the page is indexed, and if so, why?

Thank you
Lana

Hi Lana

The URL you provided doesn’t contain any parameters. It’s a flat HTML file, so search engines shouldn’t have any problems indexing it.

It’s URLs containing “query strings” that generally include parameters or variables. For example:

www.site.com/product.asp?productid=2

The question mark indicates the page is dynamic, meaning it requires some type of server-side computation to display. The page URL above contains only one parameter (productid).

See more about how Google defines dynamic vs static URLs.

These days, most search engines can index pages that contain a single parameter. It is generally when multiple parameters are used in page URLs that search engine indexing problems occur.

As Google says in their Design and Content Guidelines:

“If you decide to use dynamic pages (i.e., the URL contains a “?” character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few”.

Multiple parameters are often needed for large sites with multiple page templates and dynamically generated content for each section of each page. Multiple parameters are separated via an ampersand (&), for example:

www.site.com/product.asp?productid=2&producttype=large

The URL above instructs the template for the page product.asp to query the database and load the content for product ID number 2, AND specifically the data for the large version of that product, whenever anyone accesses the page.

This type of URL is more difficult for a search engine to index because it can’t easily determine what the multiple parameters mean or whether the URL represents a unique page.

So in this case the webmaster has the option to rewrite the URLs at the server level to remove the parameters (there’s a sketch of this below), or else block search robots from indexing URLs containing multiple parameters.
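
For example, here’s a minimal sketch of what such a rewrite rule might look like in an Apache .htaccess file, based on the hypothetical product.asp URL above (if your site runs on a different server, the syntax will differ):

# Requires Apache's mod_rewrite module to be enabled
RewriteEngine On
# Map a clean, static-looking URL like /product/2/large
# to the real dynamic URL /product.asp?productid=2&producttype=large
RewriteRule ^product/([0-9]+)/([a-z]+)$ /product.asp?productid=$1&producttype=$2 [L]

With a rule like this in place, you can link to www.site.com/product/2/large throughout your site and the search engines never need to see the parameters at all.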

But if you’re in doubt, I wouldn’t worry too much about your dynamic URLs. Google and the other search engines are pretty good at parsing and determining what parameters to ignore.

——————-

Like to learn more about SEO? Download my free SEO lesson. No catch!


Q and A: How do I avoid the supplemental index if I have duplicate content?

Hi Kalena

If I have two blogs where I effectively have duplicate content, how could I get around that?

The duplicate content could be because the two blogs are for different audiences (read lists) or because sometimes we syndicate other articles. I thought of always placing a permalink to the original article, or should I play with the robots.txt file to make sure one of these pages does not get indexed? What would be the best way around this issue? I do not want to end up in the supplemental index.

Thanks

Jen

Hi Jen

I’m not convinced you need to have multiple blogs with the same content. That said though, these days you don’t need to worry too much about duplicate content. Google does a pretty good job of filtering out pages it thinks are duplicates.

However, you DO want to control which version of the blog post or article Google considers to be the original, or else Google may just decide for you. There are a couple of ways of ensuring the pages you want are indexed and similar pages are ignored or filtered out as duplicates:

1) Include only one version of the post / article in the sitemap you submit to Google.

2) Make sure internal and external links only point to the original.

3) Use a noindex robots tag on the duplicate pages.

4) Block Googlebot from accessing your duplicate pages/folders via your robots.txt file (options 3 and 4 are sketched below).
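
To make options 3 and 4 concrete, here’s a minimal sketch (the /old-blog/ path is just a placeholder for wherever your duplicate posts live). For option 3, add this inside the head section of each duplicate page:

<meta name="robots" content="noindex, follow" />

For option 4, add this to the robots.txt file at the root of the duplicate blog’s domain:

User-agent: Googlebot
Disallow: /old-blog/

One caveat: pick one approach or the other, not both. If robots.txt blocks Googlebot from fetching a page, it will never see the noindex tag on that page.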

You might also want to check the advice given in our other posts about duplicate content.

——————-

Like to learn more about SEO and duplicate content? Download our free SEO lesson. No catch!


Q and A: Will Multiple Description Tags affect my Rankings?

Hi Kalena,

I just noticed that my company’s homepage has five meta description tags within the head tag. Will this have any negative ramifications? Thank you,

Heather

Hi Heather,

The meta description tag in itself is not likely to have a significant effect on your rankings one way or another, but it is still important because, more often than not, the snippet displayed in Google search results is taken from the description tag.

Using a description tag therefore gives you some control over the “message” searchers see about what your page contains.

Having multiple description tags on the same page will not provide any SEO benefit: only the first one will be considered and the rest will probably be ignored. However, there is a chance that search engines could treat multiple tags as “spammy”.

There is NO good reason to have multiple description tags on your site. At best it is evidence of lazy coding, which increases the size of your page and slows down page load times; at worst it could be considered spamming and may result in search penalties.
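
If you’re cleaning up a page like this, a single well-formed tag is all you need. A minimal sketch, with purely hypothetical wording:

<head>
<title>Acme Widgets - Hand-crafted widgets, shipped worldwide</title>
<meta name="description" content="Acme Widgets makes hand-crafted widgets to order. Browse the range, compare prices and order online." />
</head>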

What about Keyword and Robots?

Using multiple Keywords and Robots meta tags is also probably not a good idea. Google will aggregate the content of multiple Robots tags (though they don’t advise using more than one). It is not clear how multiple Keywords tags are treated, but these days their use is mostly irrelevant anyway.
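
For example, since Google aggregates Robots tag directives, a page containing both of these tags:

<meta name="robots" content="noindex" />
<meta name="robots" content="nofollow" />

should be treated as if it contained a single tag with content="noindex, nofollow". Even so, a single tag per page is the safer and clearer choice.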

Duplicate Descriptions?

While we are talking about description tags: you should also try not to have “duplicate” description tags, i.e. multiple pages that share the same description.

The fact that Google Webmaster Tools goes to the trouble of flagging duplicate descriptions with a “warning” should be an indication that Google doesn’t think this is a good idea either. Description tags should be unique and provide a succinct (and keyword-rich) description of the content of the page.

Andy Henderson
Ireckon Web Marketing


Q and A: How do I avoid duplicate content created by my CMS for product pages on my site?

Dear Kalena…

You’ve helped us out with a couple of problems over the years ~ thanks again. Don’t have a problem this time but I do want to get your opinion/guidance so I can maybe AVOID a problem.

We handle over 5,000 products, and we want to create a page for each product using an automated page generator. Same as what thousands of other people do. Nothing fancy and no SEO tricks. Just a brief description of the item, price & how to order.

I’ll be using a template, of course, and about 75% of the words (excluding shared borders) will be common to all pages. The other 25% of words on a given page will be unique to the product/page in question.

I may be overly cautious, but I’ve learned the hard way that what seems like a good idea or what the rest of the herd is doing might not be acceptable to the SEs, especially if not executed properly. We have a fairly well-performing website and the stakes get higher as we grow. So, any tips on what to do / not do when creating these individual product pages would be appreciated.

Thanks
Rick

Dear Rick,

Sometimes it’s possible to reduce duplicate content by placing that content in a dedicated section of your website and then linking to it where necessary. This can apply to things like shipping/handling, product guarantees, returns policies and terms & conditions, which some store owners will try to display on every page but which could quite easily be put elsewhere.

Another way to make the search engines focus on the unique content is to use emphasis tags (such as H1, H2, bold, italics etc.) on that content, while using them sparingly (or not at all) in your page header, footer and other duplicated parts of the page. This will help the spiders isolate your unique page-specific content, as well as drawing your readers’ attention to the most important parts of the page. There’s a rough sketch of both ideas below.
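
Here’s how that might look on one of your product pages (the product name and file names are purely placeholders):

<!-- Unique, page-specific content gets the emphasis tags -->
<h1>Deluxe Garden Widget</h1>
<p>A brief, unique description of this product, its price and how to order it.</p>

<!-- Boilerplate is linked to, not repeated on every page -->
<p>See our <a href="/shipping.html">shipping rates</a> and <a href="/returns.html">returns policy</a>.</p>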

You could also try setting up a feature that allows users to add reviews or feedback on each of the products. This user-generated content would become yet another source of unique content for each page (and better still, you don’t have to write it yourself).

Hope this helps!

Peter Newsome
SiteMost SEO Brisbane


Q and A: Why should you prevent robots from indexing PPC landing pages?

Hi Kalena

Quick question.

One of the lessons in your SEO201 course says that if you run PPC campaigns and use landing pages for these campaigns that are similar in layout and content, you should prevent search engine robots from indexing them. Can you explain why?

The way I see it, the more files the search engines index, the more exposure you may get.

Thanks
Alex

—————————————————————————————————

Hi Alex

PPC landing pages often look nearly identical, with the only difference between them being the target keywords used. Web pages that look too similar are sometimes filtered out of the search results as duplicate content, and too much duplicate content on a domain may impact its ability to rank highly. That’s why I always recommend using your robots.txt file to prevent robots from indexing your PPC landing pages (see the example below).
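
Assuming your landing pages all live in one dedicated folder (the /landing/ path below is just a placeholder), the robots.txt entry is as simple as:

User-agent: *
Disallow: /landing/

Keeping all of your PPC landing pages in a single folder like this means one rule blocks them all, now and for future campaigns.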

If you are using existing pages on your site as PPC landing pages and they aren’t too similar to each other, there is no need to block robots from indexing them. Make sense?
