Q and A: How do I migrate my site without risking my SEO and rankings?

Hi Kalena

You optimised my website back in 2004 when we first launched the site – did an amazing job – the site is still coming up on the first page of Google 5 years later with some minor keyword tweaking now and again. The site is pretty dated now and I am just about to relaunch it.

My questions are: my new CMS has global meta description/keywords as well as page-specific ones available. Should I utilise both of these or will it be too much and negatively affect ranking? Also, any advice on migrating sites – I am a little nervous I will fall off Google under the new site, particularly as one of the most important and high-ranking pages will have a different URL. I am planning on reusing the existing meta data. Also, the new site contains Flash. Any advice on use of this – alt tags?

Thanks very much for any advice.

Hilary

————————————————————–

Dear Hilary

My word, that’s a lot of questions! Great to hear my SEO efforts from 5 years ago are still paying dividends.

On to your questions. Firstly, you should only ever use one set of META tags per page. Google and most other engines will simply take the first set indexed and ignore the rest, so including more than one just contributes to code bloat. Each page of your site should have a dedicated title and META tags individually optimized for the unique content on that page.
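
If you want to double-check that the new CMS isn’t stamping one global description across every page once the site is live, a quick script can compare what each page actually serves. Below is a rough sketch in Python; the URLs and helper names are placeholders I’ve made up for illustration:

    # Rough sketch: fetch a few pages and report any shared META descriptions,
    # which usually means a global tag is overriding the page-specific ones.
    # The URL list is a placeholder - replace it with your own pages.
    from collections import defaultdict
    from html.parser import HTMLParser
    from urllib.request import urlopen

    PAGES = [
        "http://www.example.com/",
        "http://www.example.com/about.html",
        "http://www.example.com/services.html",
    ]

    class DescriptionParser(HTMLParser):
        def __init__(self):
            super().__init__()
            self.description = ""

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "meta" and attrs.get("name", "").lower() == "description":
                self.description = attrs.get("content", "")

    def find_shared_descriptions(urls):
        seen = defaultdict(list)
        for url in urls:
            parser = DescriptionParser()
            parser.feed(urlopen(url).read().decode("utf-8", errors="replace"))
            seen[parser.description].append(url)
        for description, pages in seen.items():
            if description and len(pages) > 1:
                print(f"Shared description on {len(pages)} pages: {', '.join(pages)}")

    if __name__ == "__main__":
        find_shared_descriptions(PAGES)

Any description reported as shared across several pages is a sign the global CMS field is overriding the page-specific one.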

Secondly, you mention that one of the pages already ranking will have a new URL. Does that mean that all your other pages will retain their existing URLs? That is the ideal outcome but not always possible. For any high-ranking pages that will change URL, make sure you use 301 redirects to point the old URL to the new one so that Google and other engines know it has moved. That will also ensure that anyone clicking on existing links to the old URL (whether in search engines, directories or on other sites) will be automatically taken to the new page. My SiteProNews post about moving a site using 301 redirects should guide you.
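
Once the redirects are in place, it’s worth confirming that each old URL really does answer with a 301 and points at the right new address. Here’s a minimal sketch using Python’s standard library; the URL pairs are placeholders for your own old and new pages:

    # Minimal sketch: confirm old URLs return a 301 pointing at the new location.
    # The URL pairs below are placeholders - swap in your own old/new addresses.
    # A HEAD request is enough, since we only care about the status and Location header.
    import http.client
    from urllib.parse import urlparse

    REDIRECTS = {
        "http://www.example.com/old-page.html": "http://www.example.com/new-page.html",
    }

    def check_redirect(old_url, expected_target):
        parsed = urlparse(old_url)
        conn = http.client.HTTPConnection(parsed.netloc)
        conn.request("HEAD", parsed.path or "/")
        response = conn.getresponse()
        location = response.getheader("Location", "")
        conn.close()
        ok = response.status == 301 and location == expected_target
        print(f"{old_url} -> {response.status} {location} {'OK' if ok else 'CHECK THIS'}")

    if __name__ == "__main__":
        for old, new in REDIRECTS.items():
            check_redirect(old, new)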

Also remember to update your sitemaps in Google Webmaster Tools and Yahoo Site Explorer with your new pages.
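
If the new CMS doesn’t generate a sitemap for you, it’s easy enough to build a basic one yourself and resubmit it. A rough sketch in Python, with a made-up page list standing in for your real URLs:

    # Rough sketch: write a basic XML sitemap for resubmission to the engines.
    # Replace the hard-coded URL list with the pages from your new site.
    from datetime import date
    from xml.sax.saxutils import escape

    PAGES = [
        "http://www.example.com/",
        "http://www.example.com/new-page.html",
    ]

    def build_sitemap(urls):
        today = date.today().isoformat()
        entries = "\n".join(
            f"  <url>\n    <loc>{escape(u)}</loc>\n    <lastmod>{today}</lastmod>\n  </url>"
            for u in urls
        )
        return (
            '<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n</urlset>\n"
        )

    if __name__ == "__main__":
        with open("sitemap.xml", "w", encoding="utf-8") as f:
            f.write(build_sitemap(PAGES))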

Keep in mind that changing the structure of a page, let alone the content, will impact the way the page ranks. So your new CMS design may result in the page jumping or falling in the rankings. If the page content has been optimized for particular keywords, try to retain as much of the existing body text as you can. Use the opportunity of the redesign to optimize your existing pages for new keywords and optimize any new pages added.

Good luck!


Q and A: How do I target different regional markets using keywords?

Dear Kalena

I understand how to put keyword phrases together – I think – but with my target markets in the US, Australia and the UK, how do I target those markets? I assume I want those markets to see my page and not necessarily other countries.

How the heck do I do that?

Cliff

————————————————————–

Dear Cliff

It’s a tricky business targeting different regional markets using SEO. A couple of things to keep in mind:

1) Think about how users search differently in each market, including regional spelling and grammar. Research the market and the top-ranking sites for your target keywords in each region, then build pages that are optimized for those regional search terms and patterns.

2) Make good use of Google’s Regional Location tool.

Hope this helps!


Q and A: Do sitemap crawl errors hurt me in Google?

Dear Kalena

I have a new site, just built in late Sep 2008. I have submitted it to Google and verified it. Every week when it is crawled, it comes up with the same errors.

I’ve been back to my designer multiple times and have done everything he has said to do, and the errors still exist. These pages are not mine; they belong to a friend who had his site designed at the same place over a year ago.

My question is: does this hurt me with Google by continuing the same errors? If so, what can I do about it?

Thanks

Doug

————————————————————–

Dear Doug

No and nothing. Hope this helps!


Q and A: Do regional domains constitute a duplicate content problem?

Dear Kalena…

First of all, I find the info on your site extremely useful – I always look forward to the newsletter! I have been trying to find the time to do the SEO course, but finding the time is always a problem! However, it’s still on my to-do list.

I am trying to sort out a problem regarding duplicate content on my sites. We run local sites for each language/country we trade in (e.g. .fr for France and .co.uk for England). Unfortunately, whilst growing the business I never had time to research SEO practices, so I ended up with a lot of sites with the same duplicate content in them, including title tags, descriptions etc. I had no idea how bad this was for organic ranking, of course!

I have now created unique title tags and descriptions for ALL the pages on ALL the sites. I have also changed the content into unique content for the home page and the paternity testing page (our main pages) for each site in English. The only site with completely unique content pages is .com, plus parts of .co.uk. For the rest of the pages that still have duplicate content, I have put a NOINDEX, FOLLOW tag on them so that the spiders will not index the duplicate content. I used FOLLOW as opposed to NOFOLLOW as I still want the internal links in the pages to be picked up – does this make sense?

Also, having made such changes, how long does it normally take for Google to refresh its filters and start ranking the site? The changes are now about a month old, however the site is still not ranking.

Also, should this not work, do you have any experience with submitting a reconsideration request through Webmaster Tools? What are the upsides and downsides of this?

Any advice would be greatly appreciated.

Regards
Kevin

————————————————————–

Dear Kevin

Thanks for your coffee donation and I’m glad you like the newsletter. Now, about your tricky problem:

1) First up, take a chill pill. There’s no need to lodge a reinclusion request to Google. According to Google’s Site Status Tool, your main site is being indexed and hasn’t been removed from their datacenter results. A standard indexed page lookup shows 32 pages from your .com site have been indexed by Google, while a backward link lookup reveals at least 77 other sites are linking to yours. If you’ve put NoIndex tags on any dupe pages, you’ve covered yourself.
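
If you want to be certain those NoIndex tags are actually being served on the duplicate pages, a quick check like the sketch below will print the robots directive Googlebot sees. It’s Python, standard library only, and the URL is a placeholder for one of your own pages:

    # Quick sketch: fetch a page and print any robots META directive it serves.
    # The URL below is a placeholder - point it at your own duplicate-content pages.
    from html.parser import HTMLParser
    from urllib.request import urlopen

    class RobotsMetaParser(HTMLParser):
        def __init__(self):
            super().__init__()
            self.robots = []

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "meta" and attrs.get("name", "").lower() == "robots":
                self.robots.append(attrs.get("content", ""))

    def robots_directives(url):
        html = urlopen(url).read().decode("utf-8", errors="replace")
        parser = RobotsMetaParser()
        parser.feed(html)
        return parser.robots or ["(no robots META tag found)"]

    if __name__ == "__main__":
        for directive in robots_directives("http://www.example.com/duplicate-page.html"):
            print(directive)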

2) Next, pour yourself a drink and put your feet up. Your .fr site is also being indexed by Google, but there isn’t a dupe content issue because the site is in French, meaning that Googlebot sees the content as being completely different. Your .co.uk site is also being indexed by Google and again, there isn’t a dupe content issue because it looks like you have changed the content enough to ensure it doesn’t trip any duplicate content filters.

3) Now that you’re relaxed, log in to Google Webmaster Tools and make sure each of your domains is set to its appropriate regional search market. To do this, click on each domain in turn and choose “Set Geographic Target” from the Tools menu. Your regional domains should already be associated with their geographic locations, i.e. .co.uk should already be associated with the UK, meaning that Google will automatically give preference to your site in the SERPs shown to searchers in the UK. For your .com site, you can choose whether to associate it with the United States only (recommended, as it is your main market), or not to use a regional association at all.

4) Now it’s time to do a little SEO clean-up job on your HTML code. Fire or unfriend whoever told you to include all these unnecessary META tags in your code:

  • Abstract
  • Rating
  • Author
  • Country
  • Distribution
  • Revisit-after

All of these tags are unsupported by the major search engines and I really don’t know why programmers still insist on using them! All they do is clog up your code and contribute to bloat.
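
To track down which of your templates still carry those tags, something like the sketch below will flag them for you. It’s Python, it scans the .html files in a local folder, and the folder path and tag list are assumptions you should adjust to suit your own setup:

    # Sketch: walk local .html files and flag META tags the major engines ignore.
    # Adjust the folder path and the tag list to suit your own templates.
    import os
    from html.parser import HTMLParser

    UNSUPPORTED = {"abstract", "rating", "author", "country", "distribution", "revisit-after"}

    class MetaNameCollector(HTMLParser):
        def __init__(self):
            super().__init__()
            self.names = []

        def handle_starttag(self, tag, attrs):
            if tag == "meta":
                name = dict(attrs).get("name", "")
                if name:
                    self.names.append(name.lower())

    def flag_unsupported(folder="."):
        for root, _dirs, files in os.walk(folder):
            for filename in files:
                if not filename.endswith(".html"):
                    continue
                path = os.path.join(root, filename)
                with open(path, encoding="utf-8", errors="replace") as f:
                    collector = MetaNameCollector()
                    collector.feed(f.read())
                found = sorted(set(collector.names) & UNSUPPORTED)
                if found:
                    print(f"{path}: remove {', '.join(found)}")

    if __name__ == "__main__":
        flag_unsupported()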

5) Finally, you need to start building up your site’s link popularity and boost your Google PageRank beyond the current 2 out of 10. And by link building, I mean the good old-fashioned type – seeking out quality sites in your industry and submitting your link request manually, NOT participating in free-for-all link schemes or buying text links on low quality link farms.

Good luck!


Google Now Helps You Clean Up 404 Links

Google has just announced the easiest way to obtain inbound links to your site in a short space of time.

Webmaster Tools’ new Crawl Error Sources feature allows you to identify the sources of the 404 Not Found errors reported for your site. Listed next to “Crawl Errors” in the Webmaster Tools control panel, you’ll now find a “Linked From” column that lists the number of pages that link to a specific “Not found” URL on your site. Clicking on an item in the “Linked From” column opens a separate dialog box which lists each page that links to this URL (both internal and external), along with the date it was discovered. You can even download all your crawl error sources to an Excel file.

If your web server doesn’t handle 404s or serve error pages very well, Google has also introduced a widget for Apache or IIS, consisting of 14 lines of JavaScript, that you can paste into your custom 404 page template to help your visitors find what they’re looking for. It provides suggestions based on the incorrect URL.

You can use the “Linked From” source information to fix the broken links within your own site, place redirects to a more appropriate URL, and/or contact the webmasters who are linking to missing pages or using malformed links and ask them to fix the links.
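
If you end up with a long list of missing URLs to redirect, you can script the grunt work. The sketch below is Python and assumes an Apache server using mod_alias; the mapping of broken URLs to replacements is a placeholder you would fill in from your own crawl error report:

    # Rough sketch: turn a hand-built mapping of "Not found" URLs and their
    # replacements into Apache mod_alias "Redirect 301" lines for your config.
    # The mapping below is a placeholder - fill it from your own crawl error list.
    from urllib.parse import urlparse

    BROKEN_TO_REPLACEMENT = {
        "http://www.example.com/old-article.html": "http://www.example.com/new-article.html",
    }

    def redirect_lines(mapping):
        for broken, replacement in mapping.items():
            old_path = urlparse(broken).path or "/"
            yield f"Redirect 301 {old_path} {replacement}"

    if __name__ == "__main__":
        for line in redirect_lines(BROKEN_TO_REPLACEMENT):
            print(line)

Paste the printed lines into your .htaccess or server config, or adapt the idea for IIS.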

Webmasters have been asking for something like this for a long time, so it’s a relief to see it live at last. The official post about the feature is on Google’s Webmaster Central Blog and Matt Cutts goes into more detail on his blog.
