Q and A: Does Ask.com Accept XML Sitemaps?

Question

Hi Kalena

I have uploaded my XML sitemap to Google, Yahoo and more recently Bing, thanks to your blog post about the Bing Webmaster Center.

However, I’m wondering if Ask.com accepts XML sitemaps and, if so, how do I upload mine to Ask?

thanks
Georgia

————————————–

Hello Georgia

Yes, Ask.com DOES support XML Sitemap submissions. Here’s a blurb about it from their Webmaster Help area:

“Yes, Ask.com supports the open-format Sitemaps protocol. Once you have prepared a sitemap for your site, add the sitemap auto-discovery directive to robots.txt, or submit the sitemap file directly to us via the ping URL”

The ping URL is as follows:

http://submissions.ask.com/ping?sitemap=http%3A//www.yoursite.com/sitemap.xml

To add your sitemap to your robots.txt file, simply include this line:

Sitemap: http://www.yoursite.com/sitemap.xml

Actually it’s not just Ask that supports the addition of sitemaps in robots.txt. Did you know that both Google and Yahoo also support that method of sitemap delivery?

You can either submit your sitemap via the search engine’s appropriate submission interface (e.g. Google Webmaster Tools, Yahoo Site Explorer, Bing Webmaster Center) or specify your sitemap location in your robots.txt file as per the above instructions.
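If you’d rather script the ping than paste that long URL into a browser, here’s a minimal sketch in Python. It assumes your sitemap lives at www.yoursite.com/sitemap.xml, so swap in your own address:

import urllib.parse
import urllib.request

# Placeholder sitemap location - replace with your own
sitemap_url = "http://www.yoursite.com/sitemap.xml"

# Ask.com's ping endpoint expects the sitemap URL as an encoded parameter
ping_url = "http://submissions.ask.com/ping?sitemap=" + urllib.parse.quote(sitemap_url, safe="")

# A plain GET request is all the ping needs
with urllib.request.urlopen(ping_url) as response:
    print(response.status, response.reason)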

Q and A: How important is domain canonicalization to SEO?

Question

Hi Kalena

I use a company that “specializes” in mortgage sites and hosting. Since I am in the process of applying everything I am learning, I saw fit to have my site graded by one of the many online tools available.

The tool showed that my site is coming up for both the www and non-www versions of my domain. When I enquired with my host about doing a 301 redirect for my domain to one version, they said:

“There is nothing we or you can reset on the Xsites as this is beyond anything we have control over. We do not support any of this nor have the capability for any one else to have it”.

How much is it going to hurt me in SEO if I don’t get this fixed like the site grader suggested?

Alex

————————————–

Hi Alex

What you’re referring to here is domain canonicalization.

Search engines can sometimes index both the www and non-www versions of your domain, creating duplicate content headaches for you and diluting your link popularity. Therefore, it’s best for SEO purposes if you can stick with one version of your domain and make sure all links point to that version. The www version is my recommendation because most sites will link to you using that version anyway.

Judging by the response you got from your host, it sounds like they’re not familiar with the issue of domain canonicalization, which is concerning. If your site host won’t allow you to use a 301 to create a permanent redirect to your preferred version, you probably need to get a new host!
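
For what it’s worth, the redirect itself is only a few lines if you (or a friendlier host) have access to an Apache .htaccess file with mod_rewrite enabled. Here’s a generic sketch using a placeholder domain:

# Send any request for the non-www domain to the www version with a permanent (301) redirect
RewriteEngine On
RewriteCond %{HTTP_HOST} ^yoursite\.com$ [NC]
RewriteRule ^(.*)$ http://www.yoursite.com/$1 [R=301,L]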

Alternatively, you can use the Canonical Link Element. You can also specify your preferred URL version in Google Webmaster Tools.
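
The canonical link element is just a tag in the <head> of a page telling search engines which URL you consider the definitive version of that page. For your home page it would look along these lines (placeholder URL, obviously):

<link rel="canonical" href="http://www.yoursite.com/" />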

My blog post Does the canonicalization of my URL impact my search engine rankings? might also be of interest.

An Overview of Bing Webmaster Center

Hands up those of you who have verified your sites with Google Webmaster Tools? Ok, good. Now keep your hands up if you’ve done the same for Yahoo Site Explorer? Hmmm a few hands dropped then.

Now keep your hands up if you’ve verified your site with Bing Webmaster Center? Oh dear.

Seems quite a few webmasters are concentrating on Google and forgetting about the other major search engines. If you want to understand how search engines interact with your site and find potential issues before they impact your traffic, you really need to verify your site and sitemaps with the big 3 and monitor your stats regularly.

Most people are familiar with Google Webmaster Tools and Yahoo Site Explorer, but today I want to give you a brief overview of Bing Webmaster Center.

To add a site to Bing Webmaster Center, simply log in to your Bing account (or create a new one), then type in your URL and a sitemap URL if you have one. You will be prompted to verify your site via either a meta verification tag you place in your home page header, or an XML file that you upload to your server.
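
If you choose the meta tag option, Bing gives you a tag to paste between the <head> tags of your home page. It looks something like this, although the content value below is a made-up placeholder and you should use the one Bing generates for you:

<meta name="msvalidate.01" content="1234567890ABCDEF1234567890ABCDEF" />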

Once you’ve verified your first site, you’ll see a dashboard that looks quite similar to Google Webmaster Tools, with the following tabs:

  • Summary – lists the date Bing last crawled your site, the number of indexed pages, your domain score and the top 5 pages of your site.
  • Profile – lists your URL, the verification process you used and the email address associated with your site.
  • Crawl Issues – lists any issues Bing discovered while crawling and indexing your site, such as 404 errors, malware infections and long dynamic URLs.
  • Backlinks – lists which webpages (including your own) are linking to your site.
  • Outbound Links – lists the web pages your site is linking to.
  • Keywords – allows you to see how your pages are performing in search results for specific keywords.
  • Sitemaps – provides various ways for you to notify MSNBot of new sitemaps or when you change an existing sitemap.

The following additional tools are available when you’re logged into Webmaster Center:

  • Robots.txt validator
  • HTTP verifier
  • Keyword research tool

So don’t ignore Bing Webmaster Center. Remember that Google is NOT the Internet.

Google Now Helps You Improve Your Site Performance

A new addition in Webmaster Tools this week sees Google becoming your own personal usability and accessibility consultant.

Site Performance, an experimental feature added to the Webmaster Tools console courtesy of Google Labs, provides detailed information about your site’s load time and gives suggestions for speeding it up. It includes a chart of your site performance data over time, which can help determine latency triggers.

As explained in Google’s official blog post about it, the Site Performance console includes examples of specific pages and their actual page load times, plus Page Speed suggestions that can help reduce latency.

I was pretty shocked when I logged into Webmaster Tools today to find my blog pages take an average of 6 seconds to load. Google states that this is slower than 83% of sites! The Example Pages and Page Speed Suggestions revealed the culprits were an unoptimized banner ad and a couple of extra DNS fetches on some pages, so I was able to fix the issues pretty quickly.

The load time data is apparently aggregated from users of the Google Toolbar, but it’s important to remember that it’s all averaged. A specific user may experience your site faster or slower than the average, depending on their location and network conditions.
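
If you want a rough sanity check of your own, a quick script like the Python sketch below will time how long a page takes to download. Bear in mind it’s a single fetch from wherever you happen to be, it only times the HTML itself (not images, ads or scripts, which were my real culprits), and the URLs are placeholders to replace with your own:

import time
import urllib.request

# Placeholder pages to test - substitute your own URLs
pages = [
    "http://www.yoursite.com/",
    "http://www.yoursite.com/blog/",
]

for url in pages:
    start = time.time()
    # Read the full response body so we time the whole download, not just the headers
    with urllib.request.urlopen(url) as response:
        response.read()
    print("%.2f seconds  %s" % (time.time() - start, url))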

As a Labs tool, Site Performance is still under development and Google are seeking feedback on it via the Webmaster Tools Forum.

Q and A: Why doesn’t Google index my entire site?

Question

Dear Kalena…

I have been on the internet since 2006. I re-designed my site, and for the past year Google has still only indexed 16 of its 132 pages.

Why doesn’t Google index the entire site? I use an XML sitemap. I also wanted to know if leaving my old product pages up will harm my rankings. I have the sitemap set up to only index the new stuff and leave the old alone, and I’ve got the robots.txt file doing this as well. What should I do?

Jason

————————————–

Hi Jason

I’ve taken a look at your site and I see a number of red flags:

  • Google hasn’t stored a cache of your home page. That’s weird. But maybe not so weird if you’ve stopped Google indexing your *old* pages.
  • I can’t find your robots.txt file. The location it should be in (the root of your domain) leads to a 404 page that contains WAY too many links to your product pages. The sheer number of links on that page and the excessive keyword repetition may have tripped a Googlebot filter. Google will be looking for your robots.txt file in the same location that I did.
  • Your XML sitemap doesn’t seem to contain links to all your pages. It should.
  • Your HTML code contains duplicate title tags. Not necessarily a problem for Google, but it’s still extraneous code.

Apart from those things, your comments above worry me. What do you mean by “old product pages”? Is the content still relevant? Do you still sell those products? If the answer is no to both, then remove them or 301 redirect them to replacement pages.

Why have you only set up your sitemap and robots.txt to index your new pages? No wonder Google hasn’t indexed your whole site. Googlebot was probably following links from your older pages and now it can’t. Your old pages contain links to your new ones, right? So why would you deliberately sabotage the ability to have your new pages indexed? Assuming I’m understanding your actions correctly, any rankings and traffic you built up with your old pages have likely gone as well.
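
If you’re not sure exactly what your robots.txt is blocking, a quick script can cross-check it against your sitemap. Here’s a rough sketch in Python, assuming a standard sitemap.xml and robots.txt in the root of a placeholder domain:

import urllib.request
import urllib.robotparser
import xml.etree.ElementTree as ET

# Placeholder domain - replace with your own
site = "http://www.yoursite.com"

# Load and parse robots.txt the same way a crawler would
robots = urllib.robotparser.RobotFileParser(site + "/robots.txt")
robots.read()

# Pull every <loc> entry out of the XML sitemap
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
tree = ET.parse(urllib.request.urlopen(site + "/sitemap.xml"))
urls = [loc.text.strip() for loc in tree.findall(".//sm:loc", ns)]

# Report any sitemap URL that robots.txt tells Googlebot not to crawl
for url in urls:
    if not robots.can_fetch("Googlebot", url):
        print("Blocked by robots.txt:", url)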

Some general advice to fix the issues:

  • Run your site through the Spider Test to see how search engines index it.
  • Remove the indexing restrictions in your robots.txt file and move it to the root of your domain, where search engines expect to find it.
  • Add all your pages to your XML sitemap and change the priority tags so they’re not all set to 1 (sheesh!). See the example entry below this list.
  • Open a Google Webmaster Tools account and verify your site. You’ll be able to see exactly how many pages of your site Google has indexed and when Googlebot last visited. If Google is having trouble indexing the site, you’ll learn about it and be given advice for how to fix it.
  • You’ve got a serious case of code bloat on your home page. The more code you have, the more potential indexing problems you risk. Shift all that excess layout code to a CSS file for Pete’s sake.
  • The number of outgoing links on your home page is extraordinary. Even Google says don’t put more than 100 links on a single page. You might want to heed that advice.
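
On the priority point above, a typical sitemap entry looks something like this (placeholder URL and values). The priority value is relative to your own pages, so setting every page to 1 tells search engines nothing about which pages matter most:

<url>
  <loc>http://www.yoursite.com/products/blue-widget.html</loc>
  <changefreq>monthly</changefreq>
  <priority>0.6</priority>
</url>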