Dumbass of the Week: Virgin Blue

Today, an unknown number of Virgin Blue Velocity members (including me) received an email titled “Surprise!- You’ve Turned Gold”.

The email announced that the recipient had been granted a free upgrade to Velocity Gold, the Gold level of Virgin Blue’s frequent flyer programme, for a period of one year. The email went on to describe perks available to Gold members when travelling, including:

  • Free Lounge membership, so you can catch up on work, relax and escape the airport crowds.
  • Priority check-in.
  • Up to 32kg of checked baggage at no cost.
  • Two personalised baggage tags.

No reason for the unexpected upgrade was given, apart from “you came so close to making it on your own”, suggesting that the recipient’s Velocity points for the past 12 months came close to the number required to qualify for Gold status. Except they didn’t. At least not in my case. Not even close. You normally need to reach 50,000 points to qualify.

My husband received the same email, as did many others, judging by the discussion on Twitter shortly thereafter.

Still, it was a delightful surprise. Feeling chuffed, I clicked on a link within the email to an explanation of Velocity Gold. ONOZ. It led to a bizarre error message stating that the whole thing had been a terrible mistake:

Friday the 13th strike
Oops! Due to an error, you may have received an email regarding a Gold upgrade by mistake. Please disregard the free upgrade communication. We apologise for any inconvenience caused.

Virgin Blue has yet to offer any explanation for the error, apart from a single tweet blaming the email screw-up on Friday the 13th. The backlash on Twitter so far has been brutal. Result? Gold Standard Marketing FAIL.

What do you think? Should Virgin Blue honour their original offer? Or is their Friday the 13th excuse enough? Please add your comments below.

UPDATE 1: According to @bengrubb, Virgin Blue is blaming an IT glitch for the problem.

UPDATE 2: I created a new hashtag on Twitter for the incident, #velocitygate, and it seems to have taken off.

UPDATE 3: Not sure when it went up, but the Velocity Rewards site now features an apology front and centre of its home page. It’s a step in the right direction:

UPDATE 4: According to an article in the Sydney Morning Herald today (17 November), the email glitch was human error, pure and simple, accompanied by tears of panic as the email went out to over 1 million recipients by mistake. The Australian Competition and Consumer Commission will take no action over the glitch, and Velocity members who still have an issue are being advised to contact the airline directly. Drama over, move along please, nothing to see here.

Twitter and LinkedIn Do the Happy Dance

Social media darlings Twitter and LinkedIn announced a partnership this week that enables LinkedIn users to synch their status with Twitter updates.

Similar to the Twitter / Facebook integration, when you set your status on LinkedIn you can now tweet it as well, alerting your followers on both LinkedIn AND Twitter, as well as real-time search services like Twitter Search and Bing. Simply edit your settings on LinkedIn to include a Twitter account to synch with and you’re good to go. LinkedIn has added a checkbox under the status field so your updates are automatically tweeted if you check the box.

Twitter users can also update their LinkedIn status from Twitter and Twitter clients, via the addition of the #in hashtag in a tweet.

Twitter founders Evan Williams and Biz Stone have likened the partnership to the “perfect combination” of peanut butter and chocolate. So I guess those of you who like Reese’s Peanut Butter Cups should be thrilled.

Q and A: Is it OK to use the noscript tag to add relevant keywords to a site?

Question

Dear Kalena…

On another site I ran into the use of the noscript tag for SEO.

What do you think of using it to add keywords relevant to the site?

Thanks

Jena

Hi Jena,

For those who aren’t familiar with the noscript tag – despite great advancements in search technology over the years, search bots still have trouble reading content contained within scripted sections of a page (such as JavaScript, Flash, etc.). To remedy this, the noscript tag was introduced so that web developers could provide equivalent content for search bots and browsers that don’t support scripting.

As you can imagine, it didn’t take long before the tag was exploited for keyword stuffing, which resulted in penalties.

There are still plenty of positive ways to use this tag that will not result in penalties – such as embedding links if you have a script-based navigation structure – and there’s certainly no harm in using it to include relevant content or keywords, as long as it matches the content contained within the scripted version of the page (see the sketch below). But if you try to exploit it, you’ll end up running into trouble.
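To make that concrete, here’s a minimal sketch of the legitimate use described above: a script-driven navigation block with a noscript fallback that mirrors the same links, rather than stuffing in unrelated keywords. The script file name, pages and anchor text are invented for illustration.

    <!-- Navigation rendered by JavaScript for visitors whose browsers support it -->
    <script type="text/javascript" src="nav-menu.js"></script>

    <!-- Fallback for search bots and non-scripting browsers: the same links
         and anchor text as the scripted menu, and nothing more -->
    <noscript>
      <ul>
        <li><a href="/services.html">SEO Services</a></li>
        <li><a href="/portfolio.html">Portfolio</a></li>
        <li><a href="/contact.html">Contact Us</a></li>
      </ul>
    </noscript>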

However, if you’re looking to use this tag to improve the on-site SEO of a website, my advice would be to create a script-free architecture (through the use of CSS) and work on creating content that is visible to bots and users without the need for such tags. This will not only have a better impact on the SEO of your site, but will also improve its functionality across a wider range of browsers and mobile devices (which commonly have issues displaying scripted content too).
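As a rough illustration of that script-free approach, the same navigation can simply be a plain HTML list styled with CSS, so every bot and browser sees it without any fallback tags. Again, the class name and pages are made up for the example.

    <style type="text/css">
      ul.main-nav { list-style: none; margin: 0; padding: 0; }
      ul.main-nav li { display: inline; margin-right: 12px; }
    </style>

    <ul class="main-nav">
      <li><a href="/services.html">SEO Services</a></li>
      <li><a href="/portfolio.html">Portfolio</a></li>
      <li><a href="/contact.html">Contact Us</a></li>
    </ul>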

Hope this helps

Peter Newsome
SiteMost Search Engine Optimisation

Google About to Caffeinate a Data Center

A few months back, Google announced they were planning a rollout of a new search infrastructure called Caffeine.

Google engineers have been busy ensuring the new search infrastructure improves Google’s indexing speed and ability to scale as the web grows exponentially. To help in this quest, they made a web developer preview of Caffeine available in August and asked for public feedback. What wasn’t made clear by Google was exactly when the rollout would take place.

Google ended that speculation with their actions this week, closing the Caffeine Sandbox to public viewing and replacing it with a message thanking testers for their feedback and announcing that Caffeine is *ready for a larger audience*.

Part of the message reads:

“Soon we will activate Caffeine more widely, beginning with one data center. This sandbox is no longer necessary and has been retired, but we appreciate the testing and positive input that webmasters and publishers have given.”

Seen any caffeinated Google SERPs yet? Let me know via comments below.

Q and A: Why doesn’t Google index my entire site?

Question

Dear Kalena…

I have been on the internet since 2006. I re-designed my site, and for the past year Google has still only indexed 16 pages out of 132.

Why doesn’t Google index the entire site? I use an XML sitemap. I also wanted to know if leaving my old product pages up will harm my rankings. I have set up the sitemap to only index the new stuff and leave the old alone, and I’ve got the robots.txt file doing this as well. What should I do?

Jason

Hi Jason

I’ve taken a look at your site and I see a number of red flags:

  • Google hasn’t stored a cache of your home page. That’s weird. But maybe not so weird if you’ve stopped Google indexing your *old* pages.
  • I can’t find your robots.txt file. The location it should be in leads to a 404 page that contains WAY too many links to your product pages. The sheer number of links on that page and the excessive keyword repetition may have tripped a Googlebot filter. Google will be looking for your robots.txt file in the same location that I did.
  • Your XML sitemap doesn’t seem to contain links to all your pages. It should.
  • Your HTML code contains duplicate title tags. Not necessarily a problem for Google, but it’s still extraneous code (see the snippet below).
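On that last point, this is the sort of thing I mean – the head of a page only needs one title element, so any duplicate is dead weight. The wording here is invented for illustration:

    <head>
      <title>Widgets Online – Buy Quality Widgets</title>
      <title>Widgets Online – Buy Quality Widgets</title>  <!-- duplicate: delete this line -->
    </head>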

Apart from those things, your comments above worry me. What do you mean by “old product pages”? Is the content still relevant? Do you still sell those products? If the answer is no to both, then remove them or 301 redirect them to replacement pages.
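For example, if the site happens to run on Apache (an assumption on my part – adjust for your own server), a permanent redirect from a retired product page to its replacement is a one-liner in the site’s .htaccess file. The file names and domain below are placeholders, not your actual URLs:

    # .htaccess – permanently redirect a retired product page to its replacement
    Redirect 301 /products/old-widget.html http://www.example.com/products/new-widget.html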

Why have you set up your sitemap and robots.txt to cover only your new pages? No wonder Google hasn’t indexed your whole site. Googlebot was probably reaching your new pages by following links from your older ones, and now it can’t. Your old pages contain links to your new ones, right? So why would you deliberately sabotage the ability to have your new pages indexed? Assuming I’m understanding your actions correctly, any rankings and traffic you built up with your old pages have likely gone as well.

Some general advice to fix the issues:

  • Run your site through the Spider Test to see how search engines index it.
  • Remove the indexing restrictions in your robots.txt file and move it to where the search engines expect to find it – the root of your domain (there’s a rough example of both files after this list).
  • Add all your pages to your XML sitemap and change the priority tags so they’re not all set to 1 (sheesh!).
  • Open a Google Webmaster Tools account and verify your site. You’ll be able to see exactly how many pages of your site Google has indexed and when Googlebot last visited. If Google is having trouble indexing the site, you’ll learn about it and be given advice for how to fix it.
  • You’ve got a serious case of code bloat on your home page. The more code you have, the more potential indexing problems you risk. Shift all that excess layout code to a CSS file for Pete’s sake.
  • The number of outgoing links on your home page is extraordinary. Even Google says don’t put more than 100 links on a single page. You might want to heed that advice.
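Pulling the robots.txt and sitemap points together, here’s roughly what I’d expect the two files to look like once the restrictions are removed. The domain, URLs and priority values are placeholders rather than your actual pages:

    # robots.txt – served from the root of the domain,
    # e.g. http://www.example.com/robots.txt
    User-agent: *
    Disallow:
    # (an empty Disallow line blocks nothing)

    Sitemap: http://www.example.com/sitemap.xml

And the sitemap itself should list every page you want indexed, old and new:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <priority>1.0</priority>
      </url>
      <url>
        <loc>http://www.example.com/products/new-widget.html</loc>
        <priority>0.8</priority>
      </url>
      <!-- ...one url entry per page you want indexed -->
    </urlset>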