Google’s Cross-Product Webinar

Google have announced a free cross-product webinar for webmasters to learn more about three of their most used products, Google Webmaster Tools, Google Analytics and Google Website Optimizer, and how they can work together to enhance your website.

The webinar will be held on 8th July 2008 at 9:00am PT (Pacific Time). To attend, you need to register. Those who can’t make it will be able to access an archived version of the presentation via the same registration URL. This is the first time Google have offered a joint webinar for these products.

Share this post with others

Q and A: What is an XML Sitemap and why do I need one?

Hi Kalena

I am not sure what an XML sitemap is. I have been to websites that will automatically generate a sitemap, but the code they create is not understandable to me, and they can only index the first 500 pages.

There are pages on my site that are important to have indexed and others that don’t matter. I have no idea how to create an XML sitemap that only lists the pages I want indexed. How can I do this? Can you clarify what an XML sitemap is and whether I can have only my important pages listed on it?

Beverly

Hi Beverly

Thanks for the caffeine donation, I’ll be sure to use it tomorrow when I visit Starbucks.

A sitemap is simply a way for search engines and visitors to find all the pages on your site more easily, and XML is just a popular format for delivering it. To quote Sitemaps.org:

Sitemaps are an easy way for webmasters to inform search engines about pages on their sites that are available for crawling. In its simplest form, a Sitemap is an XML file that lists URLs for a site along with additional metadata about each URL (when it was last updated, how often it usually changes, and how important it is, relative to other URLs in the site) so that search engines can more intelligently crawl the site.

I personally use XML Sitemaps to build all the sitemaps for my own sites and my clients’ sites. I paid for the stand-alone version so I can create sitemaps for sites with over 500 pages. At under USD 20, I believe the price is pretty reasonable and their support is pretty good, so it might be worth the investment for you. Apart from that, the instructions for using their web version are quite clear – perhaps you need to have a closer look? These sitemap FAQs should also help.

You can either create a full sitemap of your entire site and edit out any pages you don’t want indexed later, or instruct the generator to avoid certain files or sub-directories before running. Once you’ve created and downloaded the XML sitemap file for your site, simply upload it to your web server and follow the instructions to ensure it is indexed by search engines. If you’ve created a Google Webmaster Tools account, you can log in and enter your sitemap URL directly into the control panel.
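If the generator tools don’t give you enough control, remember that a sitemap is just a plain XML file you can build yourself, listing only the pages you want indexed. A minimal sketch in Python (the URLs and dates are placeholders, not real pages):

```python
# Sketch: write a sitemap.xml containing only the pages you want indexed.
# The URLs, dates and priorities below are placeholders for your own pages.
from xml.sax.saxutils import escape

pages = [
    ("http://www.example.com/", "2008-06-01", "weekly", "1.0"),
    ("http://www.example.com/services.html", "2008-05-15", "monthly", "0.8"),
]

entries = []
for loc, lastmod, changefreq, priority in pages:
    entries.append(
        "  <url>\n"
        f"    <loc>{escape(loc)}</loc>\n"
        f"    <lastmod>{lastmod}</lastmod>\n"
        f"    <changefreq>{changefreq}</changefreq>\n"
        f"    <priority>{priority}</priority>\n"
        "  </url>"
    )

sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    + "\n".join(entries)
    + "\n</urlset>\n"
)

# Upload the resulting file to your web server's root directory.
with open("sitemap.xml", "w") as f:
    f.write(sitemap)
```

Pages you leave out of the `pages` list simply won’t appear in the sitemap, which takes care of the “only my important pages” part of the question.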

Like this post? Prove it! Please press the big feed button at the top left. Thanks.


Q and A: Why is my client’s site no longer ranking in Google?

Hi Kalena

I’ve been reading your articles and find your answers to many people very helpful. So, here is my issue.

I am helping a friend with his website, which I built. I felt like we did a pretty decent job with SEO and we had some fairly high rankings for some key terms, like “lasik in chicago” (6th) and “lasik in Oakbrook” (2nd).

Recently I changed the index page to put up a larger Flash video. I also added some additional text that looks similar to some of the higher-ranking sites that compete with my friend Dr. Sloane. Since then I have noticed he has been moved down to page three for the same search terms. When I went into Google Webmaster Tools, I noticed that it shows Googlebot hasn’t accessed the homepage since 2007. Also, I see all my pages have very low PageRank.

I’m just a little bit confused and was hoping that you could give me some advice on getting his site on the right track. He has been around on the net since the mid-’90s, so the domain has some age.

Shannon

Hi Shannon

First of all, thank you for the caffeine donation, that helps a lot when I’m answering these questions in the wee hours. As for your issue, I’ve taken a look and wow, where do I start? How about here:

1) The first major content in your client’s home page HTML is a huge Flash file. Quite apart from the fact that it’s visually distracting and goes against every website usability rule possible, you’ve stuck it right after the header tags, meaning it’s the first thing search engines are going to try to index. The file isn’t optimized, so it doesn’t tell Googlebot and others anything about your page; it simply pushes the meatier content further down the code.

2) You seem to have some weird link to the iFrance site embedded in an iframe. What’s that about? It looks dodgy and search engines don’t like iframes so it’s probably triggered a red flag or two.

3) Your current home page looks and smells like a doorway page. There’s no obvious formatting, no navigation menu, the design is not consistent with the rest of the site and it doesn’t load properly in Firefox. I was half expecting to see user-agent sniffer code in the HTML, but perhaps it’s just really poor design.

4) We’re up to number 4 already, and this is probably your main problem: there seems to be some type of delayed meta refresh that kicks in after 5 seconds and redirects people to a different URL on the same domain. This is retro spam at its finest and is like waving a huge red flag at Google saying “HEY, I’M DOING SOMETHING DODGY OVER HERE! PENALIZE ME QUICK!”

Spammers like to use meta refreshes for bait and switch, i.e. showing Googlebot a family-safe DVD page like Driving Miss Daisy and then redirecting human searchers to a porn site of the… ahem… same name. Ditch the redirect pronto. Decide which home page you want to show both users and search engines and stick with it.
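Incidentally, if you’re ever unsure whether a page has a meta refresh buried in its code, you can scan the HTML for the tag. A minimal sketch in Python (the sample HTML here is made up for illustration):

```python
import re

# Sketch: detect a meta refresh tag in a page's HTML.
# The sample page below is invented for illustration.
html = """
<html><head>
<meta http-equiv="refresh" content="5;url=http://www.example.com/real-home.html">
</head><body>Welcome</body></html>
"""

# Match any <meta> tag whose http-equiv attribute is "refresh".
pattern = re.compile(
    r'<meta[^>]+http-equiv=["\']?refresh["\']?[^>]*>', re.IGNORECASE
)
match = pattern.search(html)
if match:
    print("Meta refresh found:", match.group(0))
```

The same check works on any saved page: read the file contents into the `html` variable and run the search.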

Surprisingly, your Title and META tags check out ok, although there’s a bit of excessive keyword repetition in your META Keywords tag. Googlebot last cached your home page on 13 April so check your Webmaster Tools account again.

That’s it for now, I hate to say it but my coffee’s run out.


Like this post? Why not prove it? Please press the big feed button at the top left. Thanks!


Q and A: Why do my search positions fluctuate so much in Google?

Dear Kalena

I am hoping that you can help me, as this has been driving me crazy! Certain search keywords, such as “buy Taser” for the index (home) page, go to page 1 in Google for ONE day, then jump to page 13 and climb up to page 20 or so, then go back to page 1 for one day. So, about once every two weeks, those keywords are on page one in Google for one day!

I do not have this problem with MSN. I am totally baffled. I was running Google Base thinking maybe there was a connection, but I deactivated the product search a few days ago, so I guess that is not the problem. The site is about 9 months old. Is it because my domain registration only runs a year at a time and Google thinks it will expire soon? I have tried so many things. Please help ASAP. Thank you SO much!

Terri

Hi Terri

Ack, I seem to be a magnet for Yahoo SiteBuilder issues this week. First up, please read my recent rant about Yahoo SiteBuilder.

Secondly, to answer your question about shifting rankings, in a nutshell it’s because Google uses different datacenters to show results and shuffles between these datacenters (they each have slightly different ranking algorithms). So on one datacenter your site might be on page 1 for a search query but then for that same query done a day (or hour) later you might be on page 4, because they are showing results from a different storage facility.

Different pages from your site and your competitor’s sites might also be stored on different datacenters, meaning that pages that normally rank well may not appear at all depending on which datacenter Google is using to fetch search results and whether or not all your indexed pages are listed in that datacenter. Your competitors may have more pages indexed by Google across all datacenters so they seem to be consistently outranking you. Or else they have simply done a better job of optimizing their pages to match search queries.

But the datacenter issue is the least of your worries. Here are just some of the problems I see with your site:

  • The Title and META tags are poorly constructed and not optimized for performance on search engines. This can partly be blamed on the tag limitations of Yahoo SiteBuilder, but mostly it is just poor keyword choices and incorrect formatting. For example, your META Description tag contains a bunch of keywords instead of a readable sentence! Your poor Title and META tags are limiting the ability of each of your site’s pages to be found in search engines.
  • You’ve got the worst case of code bloat I’ve seen in years, thanks to excessive code added to your HTML pages by the SiteBuilder program and the page author. Code bloat happens when unnecessary code snippets are added to your HTML during the editing of your pages. A very common cause is cutting and pasting text from one program into your web editing software: if you paste from MS Word into your web editor, for example, you can often find extraneous code (such as span tags) added. These snippets build up, inflate your file size and can lead to invalid code, meaning that Googlebot and other search robots may have problems indexing your pages and abandon the site, so fewer pages end up indexed and included in their datacenters. Code bloat also hurts the ranking relevancy of your site because it dilutes the keyword density of your pages. For example, if your competitor mentions “buy taser” in their page text the same number of times as you do, but their page has less code to wade through, their page will likely rank higher than yours for the search query “buy taser”.
  • As I suspected, your code does not validate. When I ran it through the W3C Markup Validator, it spat back 236 errors, including a missing DOCTYPE! Now, Google and other engines are pretty forgiving these days when it comes to invalid code, but even if some pages are being successfully indexed, the errors could well be sabotaging your site’s ability to rank well.
  • I’m betting that Google hasn’t indexed many of your site pages. Read this post about how to get more pages indexed and how to monitor your site’s performance in Google.
  • You’ve got keyword repetition ad infinitum happening on your home page. The excessive keyword reps are almost certainly going to trigger search engine spam filters if they haven’t already. I think you’re breaking practically all Google’s Webmaster Guidelines!
  • There don’t seem to be many internal or external links pointing to your site. You should try to gain some links from other web sites in your industry as theme-based links will help boost your position in Google.
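To put a rough number on the keyword repetition problem above, you can measure keyword density yourself. A minimal sketch in Python (the sample text and the simple occurrences-per-word formula are mine, for illustration only):

```python
# Sketch: rough keyword density of a phrase within page text.
# Density here = occurrences of the phrase / total word count.
def keyword_density(text, phrase):
    words = text.lower().split()
    total = len(words)
    if total == 0:
        return 0.0
    hits = text.lower().count(phrase.lower())
    return hits / total

# Invented sample text with heavy repetition of "buy taser".
sample = "Buy taser online. Taser deals! Buy taser now, buy taser today."
print(round(keyword_density(sample, "buy taser"), 3))  # prints 0.273
```

If your target phrase accounts for a large slice of the visible words on a page, as in the sample above, you are well into spam-filter territory and should rewrite the copy to read naturally.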

There’s lots more wrong with your site, but I think I’ve given you plenty to get on with for your $10 coffee donation. The rest is up to you. As I always say to businesses using free or cheap web design and hosting tools online, you get what you pay for. If you want potential customers to take your business seriously, YOU need to take it seriously and spend some time and money addressing your site’s compatibility with search engines.

You should consider paying a site designer to build you a better looking site that can be properly optimized. If you can’t afford a professional site design, consider installing the (free) WordPress blogging platform on your server and taking full control over your site that way. Teach yourself – it’s free – and then hire a search engine optimizer to get your site ranking better. If you can’t afford a search engine optimizer, consider posting your requirements on our Search Engine College jobs board as there are a lot of SEO students just itching to sharpen their skills on a real site.

I’d also recommend downloading the free Search Engine Optimization lesson from Search Engine College so you can better understand what makes a site rank well in search engines and take control of your own site’s destiny. Good luck!


Q and A: How do I get Google to index more pages on my site?

Dear Kalena…

I am currently reaching #1 in Google but only for two pages of my site. My question is how do I get more than two pages to return ranking? I don’t believe Google is indexing more than the two pages. How can I change this?

Gloria

Dear Gloria

Easy peasy. You need to create an XML sitemap of all your web pages and upload it to the Google Webmaster Tools area. Once you set up a Webmaster Tools account, you’ll be able to keep track of how many pages Google is indexing, what indexing issues (if any) Googlebot strikes, and other useful statistics about your site’s visibility in the Google index.

But getting your pages ranking towards the top of the search results is trickier and will require you to learn and implement some SEO tactics. You should also consider improving your site’s link popularity by obtaining more incoming links from relevant, trusted sites, particularly those in the same industry. This will help boost your rankings for industry-specific keywords and phrases.
