Q and A: Why is my client’s site no longer ranking in Google?

Hi Kalena

I’ve been reading your articles and find the answers you give people very helpful. So, here’s my issue.

I am helping a friend with his website, which I built. I felt like we did a pretty decent job with the SEO, and we had some fairly high rankings for key terms like “lasik in chicago” (6th) and “lasik in Oakbrook” (2nd).

Recently, I changed the index page to put up a larger Flash video. I also added some text similar to what appears on some of the higher-ranking sites that compete with my friend Dr. Sloane. Since then, I have noticed he has dropped to page three for the same terms. When I went into Google Webmaster Tools, I noticed it shows that Googlebot hasn’t accessed the homepage since 2007. I also see that all my pages have very low PageRank.

I’m just a little confused and was hoping you could give me some advice on getting his site back on the right track. He has been on the net since the mid-’90s, so the domain has some age.

Shannon

Hi Shannon

First of all, thank you for the caffeine donation; that helps a lot when I’m answering these questions in the wee hours. As for your issue, I’ve taken a look and wow, where do I start? How about here:

1) The first major content in your client’s home page HTML is a huge Flash file. Quite apart from the fact that it’s visually distracting and goes against every web site usability rule possible, you’ve stuck it right after the header tags, meaning it’s the first thing search engines are going to try to index. The file isn’t optimized, so it doesn’t tell Googlebot and others anything about your page; it simply pushes the meatier content further down the code.

2) You seem to have some weird link to the iFrance site embedded in an iframe. What’s that about? It looks dodgy, and search engines don’t like iframes, so it has probably triggered a red flag or two.

3) Your current home page looks and smells like a doorway page. There’s no obvious formatting, no navigation menu, the design is not consistent with the rest of the site and it doesn’t load properly in Firefox. I was half expecting to see user-agent sniffer code in the HTML, but perhaps it’s just really poor design.

4) We’re up to number 4 already, and this is probably your main problem: there seems to be some type of delayed meta refresh that kicks in after 5 seconds and redirects people to a different URL on the same domain. This is retro spam at its finest and is like waving a huge red flag at Google saying “HEY, I’M DOING SOMETHING DODGY OVER HERE! PENALIZE ME QUICK!”

Spammers like to use meta refreshes to bait and switch, i.e. show Googlebot a family-safe DVD page like Driving Miss Daisy and then redirect human searchers to a porn site of the… ahem… same name. Ditch the redirect pronto. Decide which home page you want to show both users and search engines and stick with it. If you want a quick way to audit a page for these red flags yourself, see the sketch below.
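Here’s a minimal sketch of a checker that flags the issues from points 1, 2 and 4 above: Flash markup sitting ahead of any real body text, embedded iframes, and meta refreshes. It’s written in Python purely as an illustration, and the URL is a placeholder, so point it at the page you actually want to test.

```python
# A rough home page "red flag" checker -- standard library only.
# It looks for the issues described above: Flash markup (<object>/<embed>)
# sitting ahead of any real body text, embedded iframes, and meta refreshes.
from html.parser import HTMLParser
from urllib.request import urlopen


class RedFlagParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.flags = []
        self.in_body = False
        self.seen_text = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "body":
            self.in_body = True
        elif tag == "meta" and (attrs.get("http-equiv") or "").lower() == "refresh":
            self.flags.append(f"meta refresh found: content={attrs.get('content')!r}")
        elif tag == "iframe":
            self.flags.append(f"iframe found: src={attrs.get('src')!r}")
        elif tag in ("object", "embed") and not self.seen_text:
            self.flags.append(f"<{tag}> (Flash?) appears before any body text")

    def handle_data(self, data):
        # Only count visible text once we're inside <body>.
        if self.in_body and data.strip():
            self.seen_text = True


url = "http://www.example.com/"  # placeholder -- substitute the page to test
html = urlopen(url).read().decode("utf-8", errors="replace")
parser = RedFlagParser()
parser.feed(html)
for flag in parser.flags or ["no obvious red flags found"]:
    print(flag)
```

It’s rough and ready (it will count script text inside the body as real text, for one thing), but it’s enough to catch the kind of problems described above.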

Surprisingly, your Title and META tags check out OK, although there’s a bit of excessive keyword repetition in your META Keywords tag. Googlebot last cached your home page on 13 April, so check your Webmaster Tools account again.

That’s it for now; I hate to say it, but my coffee’s run out.



Cloaking by Major Aussie Travel Site Exposed at SMX Sydney

If you haven’t already heard the news, the final and highly amusing session at SMX Sydney revealed a not-so-funny situation for large travel site Flight Centre Australia.

Late Friday afternoon, Rand Fishkin and Danny Sullivan teamed up for the SEO Clinic session, where audience members offer their sites up for SEO advice to three teams of two SEO experts each, fetchingly outfitted in doctors’ lab coats. During Rand and Danny’s review of the Discover Tasmania site, a standard code-snippet search to check for duplicate content in Google came back with a strange referrer link to this page. Straight dupe content, right? Well, not quite. A view-source check didn’t show the phrase anywhere in the HTML for that page. “Smells like cloaking!” said Danny excitedly as he shared his find with Rand live on stage.

Then followed the hilarious scenario of Danny trying to install the Firefox plug-in that allows code lookups using different user-agents. Lost windows, a non-cooperating mouse and constant old-married-couple bickering between Danny and Rand kept the audience in stitches until they finally got the plug-in to install. At last, disguised as Googlebot, a quick look at the page code confirmed what Danny suspected: Flight Centre was indeed cloaking and using content from Discover Tasmania in the process.
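If you’d rather skip the plug-in wrestling, you can run the same user-agent test with a few lines of script. Here’s a minimal sketch in Python, with a placeholder URL: fetch the page twice, once as a regular browser and once claiming to be Googlebot, then compare what comes back.

```python
# Minimal sketch of a user-agent cloaking check: fetch the same URL as a
# browser and as Googlebot, then compare the HTML served to each.
from urllib.request import Request, urlopen

URL = "http://www.example.com/"  # placeholder -- substitute the page to test
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")


def fetch(url, user_agent):
    req = Request(url, headers={"User-Agent": user_agent})
    return urlopen(req).read().decode("utf-8", errors="replace")


as_browser = fetch(URL, BROWSER_UA)
as_googlebot = fetch(URL, GOOGLEBOT_UA)

if as_browser == as_googlebot:
    print("Same HTML served to both user-agents.")
else:
    print("Different HTML served to Googlebot -- smells like cloaking!")
```

Two caveats: dynamic pages can differ between any two fetches anyway (timestamps, session IDs), so eyeball the differences rather than trusting an exact match; and smarter cloakers verify the visitor’s IP address against known Googlebot ranges rather than trusting the user-agent string, so a check like this won’t catch everyone.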

With Google’s spam fighter Adam Lasnik sitting right there on the panel, it was only a matter of time before the consequences kicked in for Flight Centre. A search today for the same snippet Rand used, “Tasmania’s capital lies in the south-east of the state”, no longer brings up a Flight Centre page. Plus, their subdomain catalogues.flightcentre.com.au shows no cache and a PageRank of zero. Looks like the whole subdomain has been delisted – ouch! The moral of the story? Cloaking is easily detectable by both your competitors and anyone with a few programming smarts, and the short-term benefits rarely outweigh the risks.

Interestingly, Flight Centre didn’t attend SMX this year, but they were at a similar conference in Sydney last year and I recall having a conversation with the Marketing Manager about them seeking to hire some SEO young guns. Looks like whoever they hired shot themselves in the foot! Shame.

More information on cloaking and the session is available from Neerav’s blog. Photo courtesy of the SMX Sydney mozzers.
