Q and A: Does Ask.com Accept XML Sitemaps?

Question

Hi Kalena

I have uploaded my XML sitemap to Google, Yahoo and more recently Bing, thanks to your blog post about the Bing Webmaster Center.

However, I’m wondering if Ask.com accepts XML sitemaps and, if so, how do I upload mine to Ask?

thanks
Georgia

————————————–

Hello Georgia

Yes, Ask.com DOES support XML Sitemap submissions. Here’s a blurb about it from their Webmaster Help area:

“Yes, Ask.com supports the open-format Sitemaps protocol. Once you have prepared a sitemap for your site, add the sitemap auto-discovery directive to robots.txt, or submit the sitemap file directly to us via the ping URL.”

The ping URL is as follows:

http://submissions.ask.com/ping?sitemap=http%3A//www.yoursite.com/sitemap.xml

To add your sitemap to your robots.txt file, simply include this line:

Sitemap: http://www.yoursite.com/sitemap.xml

Actually, it’s not just Ask that supports the addition of sitemaps in robots.txt. Did you know that both Google and Yahoo also support that method of sitemap delivery?

You can either submit your sitemap via the search engine’s appropriate submission interface (e.g. Google Webmaster Tools, Yahoo Site Explorer, Bing Webmaster Center) or specify your sitemap location in your robots.txt file as per the above instructions.
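Incidentally, Google and Bing offer similar ping endpoints if you prefer to notify them directly. These were the published URLs at the time of writing, so do check each engine’s webmaster documentation for the current details:

http://www.google.com/ping?sitemap=http%3A//www.yoursite.com/sitemap.xml
http://www.bing.com/ping?sitemap=http%3A//www.yoursite.com/sitemap.xml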


Q and A: Why aren’t our franchisee websites being found in search results?

Question

Hi Kalena,

I have just encountered something I am not sure about and I really need some advice on this. The site I am working on has the following issue:

It is a business with 100 franchises. The franchisees are complaining that they do not come up in any searches. I have checked and they don’t. Not even when you type their exact URL into the search engine.

The URL structure for the business’s franchises works like this:
www.clientsite.com/studio/location (actual URL provided)

A related problem may be that there are 3 separate XML sitemaps:
1) www.clientsite.com/sitemap/sitemap.xml
2) www.clientsite.com/sitemap/location(Alpha)sitemap.xml
3) www.clientsite.com/sitemap/location(postcodes)sitemap.xml

The first is their MAIN sitemap. The other two are sitemaps for all the locations of their franchises (100 in total). These locations and their URLs are not included in the MAIN sitemap. Is having multiple sitemaps detrimental to SEO?

Yen

Hi Yen,

You may be surprised, but this is a VERY common issue for franchise websites that are based on a template structure, and you’ll realise that the reason the franchisee pages are not being found in search results is actually pretty simple… But first, I’ll address your sitemap query.

Multiple Sitemaps

Using multiple sitemaps is not the problem here. If you do a search for site:clientsite.com in Google, you will see that the pages in question are actually indexed – which means that the search engines have found and crawled them.

I think, though, that it is probably unnecessary for your site (with just a couple of thousand pages) to have multiple sitemaps. Multiple sitemaps are recommended (and in fact required) for very large sites, but there is a specific protocol involving a sitemap index file, which you do not seem to be using; a sketch of one is shown below. You can find out more, with clear instructions and examples on how to use sitemaps correctly, at sitemaps.org.
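To give you a rough idea, a sitemap index file is just a small XML file, in the format defined at sitemaps.org, that lists the locations of your individual sitemaps. The URLs below are placeholders based on the three sitemaps you mentioned:

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.clientsite.com/sitemap/sitemap.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.clientsite.com/sitemap/location-alpha-sitemap.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.clientsite.com/sitemap/location-postcodes-sitemap.xml</loc>
  </sitemap>
</sitemapindex>

You would then submit or reference just the index file, rather than the three sitemaps individually.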

So the issue with your site is not indexing – it is ranking. You don’t specify what search queries you would hope or expect the pages to be found for, but for all the examples I tried, the franchisee pages did come up for a query of their business name itself – which is more evidence that the pages are indexed OK. From what I could see, all your franchisees seem to have a single page of content – based on a standard template, with just the business name and contact details changed. So in effect each franchisee’s page is one of 100 essentially “identical” pages on the site.

Website Templates

This is a clear issue of duplicate content, which is very common for franchise sites based upon standard templates (templates which provide the content rather than just the structure or design). In this instance, each franchisee has just a single page within the same root domain (1 of 100 almost identical pages), with relatively little keyword-rich content, so I am not surprised (and neither should you be) that it does not rank at all for general keyword phrases. In fact, even if each franchisee had their own individual domain with multiple pages of optimised, keyword-rich content, they still would not rank any better if those pages were based on the same template.

I get asked about this type of issue a lot. Excited and enthusiastic new franchisees (and multi-level marketers) set up their websites using a template provided by “the business” and pretty soon begin to wonder why the eagerly anticipated enquiries and sales aren’t flooding in.

Quality, Keyword Rich, Unique Content

One of the very first things that most SEOs learn is that to get good rankings you need quality, keyword-rich and UNIQUE content. Using a templated approach is clearly NOT a strategy you should follow to get unique content. For a graphic example, try this search query: “incalculable numbers of real people” – a snippet of text taken from a website template for a well known international “We are Not Multi Level Marketing” organisation (probably not the one you are thinking of).

That fairly specific and, you might expect, “unique” query returns over 40,000 results. Is it any wonder that most of these sites will never be found through organic search?

That’s not to say there is no value in these templated systems – many have been set up to very cleverly guide people through to the signup process – but if you “own” one of these sites you will need to use other methods to get traffic to it (PPC, advertising, etc.) and not rely on organic search traffic.

So Yen, back to your question… If your franchisees want to be found for generic keyword searches, I suggest that they register their own domains and create their own unique, keyword-rich content rather than depending on the corporate “template”.

Andy Henderson
WebConsulting


Q and A: Why doesn’t Google index my entire site?

Question

Dear Kalena…

I have been on the internet since 2006. I re-designed my site, and for the past year Google has still only indexed 16 pages out of 132.

Why doesn’t Google index the entire site? I use an XML sitemap. I also wanted to know if leaving my old product pages up will harm my rankings. I have the sitemap set up to only index the new stuff and leave the old alone, and I have the robots.txt file doing this as well. What should I do?

Jason

Hi Jason

I’ve taken a look at your site and I see a number of red flags:

  • Google hasn’t stored a cache of your home page. That’s weird. But maybe not so weird if you’ve stopped Google indexing your *old* pages.
  • I can’t find your robots.txt file. The domain root location where it should be leads to a 404 page that contains WAY too many links to your product pages. The sheer number of links on that page and the excessive keyword repetition may have tripped a Googlebot filter. Google will be looking for your robots.txt file in the same location that I did.
  • Your XML sitemap doesn’t seem to contain links to all your pages. It should.
  • Your HTML code contains duplicate title tags. Not necessarily a problem for Google, but it’s still extraneous code.

Apart from those things, your comments above worry me. What do you mean by “old product pages”? Is the content still relevant? Do you still sell those products? If the answer is no to both, then remove them or 301 redirect them to replacement pages.
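If your site runs on Apache, a 301 redirect can be a single line in your .htaccess file. This is only a sketch – the file names here are hypothetical stand-ins for your actual old and new product pages:

Redirect 301 /old-product.html http://www.yoursite.com/new-product.html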

Why have you only set up your sitemap and robots.txt to index your new pages? No wonder Google hasn’t indexed your whole site. Googlebot was probably following links from your older pages and now it can’t. Your old pages contain links to your new ones, right? So why would you deliberately sabotage the ability to have your new pages indexed? Assuming I’m understanding your actions correctly, any rankings and traffic you built up with your old pages have likely gone as well.

Some general advice to fix the issues:

  • Run your site through the Spider Test to see how search engines index it.
  • Remove the indexing restrictions in your robots.txt file and move it to your domain root, where search engines expect to find it.
  • Add all your pages to your XML sitemap and change the priority tags so they are not all set to 1 (sheesh!). See the sketch after this list.
  • Open a Google Webmaster Tools account and verify your site. You’ll be able to see exactly how many pages of your site Google has indexed and when Googlebot last visited. If Google is having trouble indexing the site, you’ll learn about it and be given advice for how to fix it.
  • You’ve got a serious case of code bloat on your home page. The more code you have, the more potential indexing problems you risk. Shift all that excess layout code to a CSS file for Pete’s sake.
  • The number of outgoing links on your home page is extraordinary. Even Google says don’t put more than 100 links on a single page. You might want to heed that advice.
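On the sitemap priority point, here is a minimal sketch of what differentiated priority values look like, per the sitemaps.org protocol. The URLs are hypothetical; priority ranges from 0.0 to 1.0 and only indicates the relative importance of pages within your own site, so setting everything to 1 tells search engines nothing:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.yoursite.com/</loc>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.yoursite.com/products/new-widget.html</loc>
    <priority>0.8</priority>
  </url>
  <url>
    <loc>http://www.yoursite.com/products/old-widget.html</loc>
    <priority>0.5</priority>
  </url>
</urlset>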

Q and A: Does Google automatically search sub-directories?

Question

Dear Kalena…

Does Google automatically search sub-directories? Or do I have to have a ‘Links’ page to force Google to index the sub-directories?

Also, I was reading about ‘redundant’ content. I have a business directory which will eventually have thousands of pages with the only main difference in content being: {Company} {City} {ST} and {Subject1}. Will Google view this as redundant content?

Best Regards,

Steve

Dear Steve,

For Google to index your sub-directories, you will need some links pointing to them. These links can simply be internal navigation links, and if you have a large website, it’s also advisable to include a sitemap that links to all the pages and sub-directories within your site.

In regard to your redundant content query: it’s best SEO practice to have at least 250 words of unique content per page. So if all the pages are the same other than the contact details, then yes, it would be considered redundant content.

My advice would be to offer a one-page listing for each company and, on that page, have a small blurb about the company, their contact details and a feature that allows users to add feedback/comments/reviews. This should provide enough information for Google to index without causing redundant or duplicate content issues.

Hope this helps!

Peter Newsome
SiteMost


Q and A: Do sitemap crawl errors hurt me in Google?

Question

Dear Kalena

I have a new site, just built in late Sep 2008. I have submitted it to Google and verified it. Every week when it is crawled, it comes up with the same errors.

I’ve been back to my designer multiple times and have done everything he has said to do, and the errors still exist. These pages are not mine; they belong to a friend who had his site designed at the same place over a year ago.

My question is: does this hurt me with Google by continuing the same errors? If so, what can I do about it?

Thanks

Doug

————————————————————–

Dear Doug

No and nothing. Hope this helps!
