An Overview of Bing Webmaster Center

Hands up those of you who have verified your sites with Google Webmaster Tools? Ok, good. Now keep your hands up if you’ve done the same for Yahoo Site Explorer? Hmmm, a few hands dropped then.

Now keep your hands up if you’ve verified your site with Bing Webmaster Center? Oh dear.

Seems quite a few webmasters are concentrating on Google and forgetting about the other major search engines. If you want to understand how search engines interact with your site and find potential issues before they impact your traffic, you really need to verify your site and sitemaps with the big 3 and monitor your stats regularly.

Most people are familiar with Google Webmaster Tools and Yahoo Site Explorer, but today I want to give you a brief overview of Bing Webmaster Center.

To add a site to Bing Webmaster Center, simply login to your Bing account (or create a new one) and then type in a URL and a sitemap if you have one. You will be prompted to verify your site via either a meta verification tag you place in your home page header, or an XML file that you upload to your server.
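For the meta tag option, the snippet you paste into your home page’s head section typically looks something like the following. This is a hypothetical sketch — Bing generates the exact tag name and content value for you to copy when you choose that verification method:

```html
<head>
  <title>My Site</title>
  <!-- Hypothetical example: Bing supplies the exact content value to use -->
  <meta name="msvalidate.01" content="1234567890ABCDEF1234567890ABCDEF" />
</head>
```

Once the tag (or the uploaded XML file) is in place, click the verify button in Webmaster Center and Bing will check for it.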

Once you’ve verified your first site, you’ll see a dashboard that looks quite similar to Google Webmaster Tools, with the following tabs:

  • Summary – lists the date Bing last crawled your site, the number of indexed pages, your domain score and the top 5 pages of your site.
  • Profile – lists your URL, the verification process you used and the email address associated with your site.
  • Crawl Issues – lists any issues Bing discovered while crawling and indexing your site, such as 404 errors, malware infections and long dynamic URLs.
  • Backlinks – lists which webpages (including your own) are linking to your site.
  • Outbound Links – lists the web pages your site is linking to.
  • Keywords – allows you to see how your pages are performing in search results for specific keywords.
  • Sitemaps – provides various ways for you to notify MSNBot of new sitemaps or when you change an existing sitemap.

The following additional tools are available when you’re logged into Webmaster Center:

  • Robots.txt validator
  • HTTP verifier
  • Keyword research tool

So don’t ignore Bing Webmaster Center. Remember that Google is NOT the Internet.

Spread the joy!

Google Now Helps You Improve Your Site Performance

A new addition in Webmaster Tools this week sees Google becoming your own personal usability and accessibility consultant.

Site Performance, an experimental feature added to the Webmaster Tools console courtesy of Google Labs, provides detailed information about your site’s load time and gives suggestions for speeding it up. It includes a chart of your site performance data over time, which can help you pinpoint what is triggering latency.

As explained in Google’s official blog post about it, the Site Performance console includes examples of specific pages and their actual page load times, plus Page Speed suggestions that can help reduce latency.

I was pretty shocked when I logged into Webmaster Tools today to find my blog pages take an average of 6 seconds to load. Google states that this is slower than 83% of sites! The Example Pages and Page Speed Suggestions revealed the culprits were a banner ad that was not optimized and a couple of extra DNS lookups on some pages, so I was able to fix the issues pretty quickly.

The load time data is apparently sourced from aggregated information gathered from users of the Google Toolbar, but it’s important to remember that it’s all averaged. A specific user may experience your site faster or slower than the average depending on their location and network conditions.

As a Labs tool, Site Performance is still under development and Google are seeking feedback on it via the Webmaster Tools Forum.

Spread the joy!

Q and A: Why doesn’t Google index my entire site?

Question

Dear Kalena…

I have been on the internet since 2006. I re-designed my site a year ago, but Google has still only indexed 16 of its 132 pages.

Why doesn’t Google index the entire site? I use an XML sitemap. I also wanted to know if leaving my old product pages up will harm my rankings. I have the sitemap set up to only index the new stuff and leave the old alone, and my robots.txt file is doing the same. What should I do?

Jason

Hi Jason

I’ve taken a look at your site and I see a number of red flags:

  • Google hasn’t stored a cache of your home page. That’s weird. But maybe not so weird if you’ve stopped Google indexing your *old* pages.
  • I can’t find your robots.txt file. The location it should be in leads to a 404 page that contains WAY too many links to your product pages. The sheer number of links on that page and the excessive keyword repetition may have tripped a Googlebot filter. Google will be looking for your robots.txt file in the same location that I did.
  • Your XML sitemap doesn’t seem to contain links to all your pages. It should.
  • Your HTML code contains duplicate title tags. Not necessarily a problem for Google, but it’s still extraneous code.
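On the robots.txt point: search engines only ever look for the file at the root of your domain (e.g. at /robots.txt). A minimal, permissive version that blocks nothing and points crawlers at your full sitemap would look like this — the domain is a placeholder:

```
User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap.xml
```

An empty Disallow line means “crawl everything”, which is usually what you want unless you have a specific reason to block a directory.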

Apart from those things, your comments above worry me. What do you mean by “old product pages”? Is the content still relevant? Do you still sell those products? If the answer is no to both, then remove them or 301 redirect them to replacement pages.
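If your site runs on an Apache server, a per-page 301 redirect can be as simple as the following .htaccess sketch (the paths here are hypothetical — substitute your actual old and new URLs):

```apache
# .htaccess - permanently redirect retired product pages
Redirect 301 /products/old-widget.html /products/new-widget.html
Redirect 301 /products/discontinued.html /products/
```

The 301 status tells search engines the move is permanent, so any link value from the old URLs should be passed to the new ones.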

Why have you only set up your sitemap and robots.txt to index your new pages? No wonder Google hasn’t indexed your whole site. Googlebot was probably following links from your older pages and now it can’t. Your old pages contain links to your new ones right? So why would you deliberately sabotage the ability to have your new pages indexed? Assuming I’m understanding your actions correctly, any rankings and traffic you built up with your old pages have likely gone also.

Some general advice to fix the issues:

  • Run your site through the Spider Test to see how search engines index it.
  • Remove the indexing restrictions in your robots.txt file and move it to your site’s root directory, where search engines expect to find it.
  • Add all your pages to your XML sitemap and vary the priority tags; setting every single page to a priority of 1 (sheesh!) tells search engines nothing about which pages matter most.
  • Open a Google Webmaster Tools account and verify your site. You’ll be able to see exactly how many pages of your site Google has indexed and when Googlebot last visited. If Google is having trouble indexing the site, you’ll learn about it and be given advice for how to fix it.
  • You’ve got a serious case of code bloat on your home page. The more code you have, the more potential indexing problems you risk. Shift all that excess layout code to a CSS file for Pete’s sake.
  • The number of outgoing links on your home page is extraordinary. Even Google says don’t put more than 100 links on a single page. You might want to heed that advice.
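To illustrate the sitemap advice above, here’s a minimal sitemap.xml sketch with every page listed and realistic, varied priority values. The URLs are hypothetical placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Home page: the only page that warrants top priority -->
  <url>
    <loc>http://www.example.com/</loc>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/products/new-widget.html</loc>
    <priority>0.8</priority>
  </url>
  <url>
    <loc>http://www.example.com/about.html</loc>
    <priority>0.5</priority>
  </url>
</urlset>
```

Remember that priority is relative to your own pages only, and values range from 0.0 to 1.0.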

Spread the joy!

Q and A: Why are some directory backlinks not showing-up?

Question

Dear Kalena…

As I am submitting my site to directories as part of my link building campaign, I go back to some of the ones I submitted to 3 or 4 months ago and can see my info/link on these directories. However, when I check Google Webmaster Tools or Yahoo Site Explorer, these links are not listed as incoming links.

Can you tell me why that is?

Yen

Dear Yen…

There could be a number of reasons for this.

It’s not uncommon for directories to utilise the ‘nofollow’ attribute, or to use redirects or JavaScript to link to your site. If any of these techniques have been applied, the link won’t be shown in the Webmaster Tools / Site Explorer reports.
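For example, a directory listing that looks like a normal link on the page may actually be marked up in one of these ways (the markup and URLs below are hypothetical), and neither form will show up as a standard backlink:

```html
<!-- nofollow: tells engines not to count the link as an endorsement -->
<a href="http://www.example.com/" rel="nofollow">Example Site</a>

<!-- redirect: the crawler sees the directory's own URL, not yours -->
<a href="http://directory.example.org/out?id=12345">Example Site</a>
```

You can spot both cases by viewing the directory page’s HTML source and checking what the anchor actually points to.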

Some directories are very low quality: they excessively employ reciprocal linking, sell links and/or link out to bad neighbourhoods. If you’ve got a link from one of these directories, it won’t pass any value and may not be displayed in the link checking tools either.

Then there are the really big directories with hundreds or thousands of categories and sub-categories. If those categories aren’t updated on a regular basis, there’s a very good chance the spiders haven’t crawled deeply enough to even find your link.

While submitting to a few of the more trusted directories isn’t a bad thing, my advice would be to adjust your link building approach.

Spend more time building relationships and trying to get a few really good, relevant links from well trusted sites instead of submitting to a bunch of average directories that most likely don’t pass much real link value.

Hope this helps!

Peter Newsome
SiteMost SEO Brisbane

Spread the joy!

Q and A: How do I migrate my site without risking my SEO and rankings?

Question

Hi Kalena

You optimised my website back in 2004 when we first launched the site – did an amazing job – site still coming up first page of Google 5 years later with some minor keyword tweaking now and again. The site is pretty dated now and I am just about to relaunch it.

My questions are: my new CMS has global meta description/keywords as well as page-specific ones available. Should I utilise both of these or will it be too much and negatively affect ranking? Also, any advice on migrating sites? I am a little nervous I will fall off Google under the new site, particularly as one of the most important and high ranking pages will have a different URL. I am planning on reusing existing meta data. Also, the new site contains Flash. Any advice on use of this – alt tags?

Thanks very much for any advice.

Hilary


Dear Hilary

My word, that’s a lot of questions! Great to hear my SEO efforts from 5 years ago are still paying dividends.

On to your questions. Firstly, you should only ever use one set of META tags per page. Google and most other engines will simply take the first set indexed and ignore the others, so including more than one just contributes to code bloat. Each page of your site should have a dedicated title and META tags individually optimized for the unique content on that page.
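In practice, each page’s head section carries its own single set, along these lines (the titles and descriptions below are made-up examples, not values from your site):

```html
<head>
  <!-- One unique title and one description per page - no global duplicates -->
  <title>Handmade Oak Dining Tables | Example Furniture Co</title>
  <meta name="description" content="Solid oak dining tables, handmade to order. Free delivery nationwide." />
</head>
```

If your CMS insists on outputting the global set as well, disable it or leave the global fields empty so each page serves only its page-specific tags.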

Secondly, you mention that one of the pages already ranking will have a new URL. Does that mean that all your other pages will retain their existing URLs? That is the ideal outcome but not always possible. For any high-ranking pages that will change URL, make sure you use 301 redirects to point the old URL to the new one so that Google and other engines know it has moved. That will also ensure that anyone clicking on existing links to the old URL (whether in search engines, directories or on other sites) will be automatically taken to the new page. My SiteProNews post about moving a site using 301 redirect should guide you.
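On an Apache server, one way to sketch that kind of page-level 301 is with a rewrite rule in .htaccess — the old and new URLs here are hypothetical, so substitute your own:

```apache
# .htaccess - permanently redirect the renamed high-ranking page
RewriteEngine On
RewriteRule ^old-services\.html$ /our-services/ [R=301,L]
```

Put the rule in place before the new site goes live, and test by requesting the old URL and confirming you land on the new page.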

Also remember to update your sitemaps in Google Webmaster Tools and Yahoo Site Explorer with your new pages.

Keep in mind that changing the structure of a page, let alone the content, will impact the way the page ranks. So your new CMS design may result in the page jumping or falling in the rankings. If the page content has been optimized for particular keywords, try to retain as much of the existing body text as you can. Use the opportunity of the redesign to optimize your existing pages for new keywords and optimize any new pages added.

Good luck!

Spread the joy!