Q and A: Why doesn’t Google index my entire site?

Question

Dear Kalena…

I have been on the internet since 2006. I re-designed my site a year ago, but Google has still only indexed 16 of my 132 pages.

Why doesn’t Google index the entire site? I use an XML sitemap. I also wanted to know if leaving my old product pages up will harm my rankings. I have the sitemap set up to only index the new stuff and leave the old alone. I’ve also got the robots.txt file doing this. What should I do?

Jason

Hi Jason

I’ve taken a look at your site and I see a number of red flags:

  • Google hasn’t stored a cache of your home page. That’s weird. But maybe not so weird if you’ve stopped Google indexing your *old* pages.
  • I can’t find your robots.txt file. The location it should be in leads to a 404 page that contains WAY too many links to your product pages. The sheer number of links on that page and the excessive keyword repetition may have tripped a Googlebot filter. Google will be looking for your robots.txt file in the same location that I did.
  • Your XML sitemap doesn’t seem to contain links to all your pages. It should.
  • Your HTML code contains duplicate title tags. Not necessarily a problem for Google, but it’s still extraneous code.
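On that robots.txt point: search engines expect the file at the root of your domain. As a reference point, a minimal robots.txt that lets all crawlers index everything and points them to your sitemap looks something like this (the domain here is just a placeholder):

```
User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap.xml
```

An empty Disallow line means “nothing is blocked” — which, given your indexing troubles, is probably the safest starting point until you know exactly what you want excluded.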

Apart from those things, your comments above worry me. What do you mean by “old product pages”? Is the content still relevant? Do you still sell those products? If the answer is no to both, then remove them or 301 redirect them to replacement pages.

Why have you only set up your sitemap and robots.txt to index your new pages? No wonder Google hasn’t indexed your whole site. Googlebot was probably following links from your older pages and now it can’t. Your old pages contain links to your new ones, right? So why would you deliberately sabotage the ability to have your new pages indexed? Assuming I’m understanding your actions correctly, any rankings and traffic you built up with your old pages have likely gone as well.

Some general advice to fix the issues:

  • Run your site through the Spider Test to see how search engines index it.
  • Remove the indexing restrictions in your robots.txt file and move it to the root directory of your domain, where search engines expect to find it.
  • Add all your pages to your XML sitemap and change the priority tags so they’re not all set to 1 (sheesh!).
  • Open a Google Webmaster Tools account and verify your site. You’ll be able to see exactly how many pages of your site Google has indexed and when Googlebot last visited. If Google is having trouble indexing the site, you’ll learn about it and be given advice for how to fix it.
  • You’ve got a serious case of code bloat on your home page. The more code you have, the more potential indexing problems you risk. Shift all that excess layout code to a CSS file for Pete’s sake.
  • The number of outgoing links on your home page is extraordinary. Even Google says don’t put more than 100 links on a single page. You might want to heed that advice.
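To illustrate the sitemap point: every page should have its own entry, with priority values that actually reflect relative importance rather than 1 across the board. A couple of example entries (URLs and values are placeholders only) might look like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/products/new-widget.html</loc>
    <priority>0.6</priority>
  </url>
</urlset>
```

Remember that priority is relative to your own site only — setting everything to 1 tells the engines nothing.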
Spread the joy!

Q and A: Why are some directory backlinks not showing up?

Question

Dear Kalena…

As I am submitting my site to directories as part of my link building campaign, I go back to some of the ones I submitted to 3 or 4 months ago and can see my info/link on these directories. However, when I check in Google Webmaster Tools or Yahoo Site Explorer, these links are not listed as incoming links.

Can you tell me why that is?

Yen

Dear Yen…

There could be a number of reasons for this.

It’s not uncommon for directories to utilise the ‘nofollow’ attribute, or to use redirects or JavaScript to link to your site. If any of these techniques have been applied, the link won’t be shown in the Webmaster Tools / Site Explorer reports.

Some directories are very low quality, excessively employ reciprocal linking, sell links and/or link out to bad neighbourhoods – if you’ve got a link from one of these directories, the link won’t pass any value and may not be displayed in the link checking tools either.

Then there are the really big directories with hundreds or thousands of categories and sub-categories. If those categories aren’t updated on a regular basis, there’s also a very good chance the spiders haven’t crawled deeply enough to even find your link.

While submitting to a few of the more trusted directories isn’t a bad thing, my advice would be to adjust your link building approach.

Spend more time building relationships and trying to get a few really good, relevant links from well trusted sites instead of submitting to a bunch of average directories that most likely don’t pass much real link value.

Hope this helps!

Peter Newsome
SiteMost SEO Brisbane

Spread the joy!

Q and A: How do I migrate my site without risking my SEO and rankings?

Question

Hi Kalena

You optimised my website back in 2004 when we first launched the site – did an amazing job – site still coming up first page of Google 5 years later with some minor keyword tweaking now and again. The site is pretty dated now and I am just about to relaunch it.

My questions are: My new CMS has global meta description/keywords as well as page-specific ones available. Should I utilise both of these, or will it be too much and negatively affect ranking? Also, any advice on migrating sites? I’m a little nervous I will fall off Google under the new site, particularly as one of the most important and high-ranking pages will have a different URL. I’m planning on reusing existing meta data. Also, the new site contains Flash. Any advice on use of this – alt tags?

Thanks very much for any advice.

Hilary

————————————————————-

Dear Hilary

My word, that’s a lot of questions! Great to hear my SEO efforts from 5 years ago are still paying dividends.

On to your questions. Firstly, you should only ever use one set of META tags per page. Google and most other engines will simply take the first set indexed and ignore the others, so including more than one just contributes to code bloat. Each page of your site should have a dedicated title and META tags individually optimized for the unique content on that page.
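By way of illustration, each page’s head section should carry exactly one title and one set of META tags written for that page alone (the wording below is purely an example, not your actual content):

```html
<head>
  <title>Hand-Made Leather Satchels | Example Store</title>
  <meta name="description" content="Hand-made leather satchels crafted in small batches. Free shipping within Australia.">
  <meta name="keywords" content="leather satchels, hand-made bags, leather bags australia">
</head>
```

If your CMS insists on inserting the global set too, switch the global fields off for any page that has its own.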

Secondly, you mention that one of the pages already ranking will have a new URL. Does that mean that all your other pages will retain their existing URLs? That is the ideal outcome but not always possible. For any high-ranking pages that will change URL, make sure you use 301 redirects to point the old URL to the new one so that Google and other engines know it has moved. That will also ensure that anyone clicking on existing links to the old URL (whether in search engines, directories or on other sites) will be automatically taken to the new page. My SiteProNews post about moving a site using 301 redirect should guide you.
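If your site runs on Apache, a 301 redirect for a single moved page can be set up in your .htaccess file along these lines (the old and new paths here are placeholders for your own URLs):

```apache
# Permanently redirect the old product page to its replacement
Redirect 301 /old-product-page.html http://www.example.com/new-product-page.html
```

One line per moved page is all it takes, and both visitors and search engine spiders will be carried across to the new URL automatically.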

Also remember to update your sitemaps in Google Webmaster Tools and Yahoo Site Explorer with your new pages.

Keep in mind that changing the structure of a page, let alone the content, will impact the way the page ranks. So your new CMS design may result in the page jumping or falling in the rankings. If the page content has been optimized for particular keywords, try to retain as much of the existing body text as you can. Use the opportunity of the redesign to optimize your existing pages for new keywords and optimize any new pages added.

Good luck!

Spread the joy!

Q and A: How do I target different regional markets using keywords?

Question

Dear Kalena

I understand how to put keyword phrases together – I think, but with my target markets in the US, Australia and the UK, how do I target those markets? I assume I want those markets to see my page and not necessarily other countries.

How the heck do I do that?

Cliff

————————————————————-

Dear Cliff

It’s a tricky business targeting different regional markets using SEO. A couple of things to keep in mind:

1) Think about how users search differently in each market. Think about regional spelling and grammar uses. Research the market and the top ranking sites for your target keywords in those markets. Then build pages that are optimized for those regional search terms and patterns.

2) Make good use of Google’s Regional Location tool.

Hope this helps!

Spread the joy!

Q and A: Do sitemap crawl errors hurt me in Google?

Question

Dear Kalena

I have a new site, just built in late Sep 2008. I have submitted it to Google and verified it. Every week when it is crawled, it comes up with the same errors.

I’ve been back to my designer multiple times and have done everything he has said to do, and the errors still exist. These pages are not mine; they belong to a friend who had his site designed at the same place over a year ago.

My question is: Does this hurt me with Google by continuing the same errors? If so, what can I do about it?

Thanks

Doug

————————————————————-

Dear Doug

No and nothing. Hope this helps!

Spread the joy!