SMX Sydney 2010: Search Engine Penalties

Greg Grothaus from Google is talking now about Search Engine Penalties.

Yes, Google promotes some web sites and demotes others. We sometimes penalize web sites for spam, but most of the penalties we apply relate to malware, spyware and the like, rather than spam.

Greg says they update the ranking algo 500+ times a year and that number is rising.

What is NOT Spam?

– improvements to a site’s architecture

– adding relevant keywords

– offering high quality info

– normal PR and marketing

– editorial linking to pages

What IS Spam?

Sites Positioned Above Mine

– Actually, it’s anything that violates Google’s quality guidelines. Greg then defined black hat vs white hat.

“Make pages primarily for users, not for search engines”. “Does this help my users? Would I do this if search engines didn’t exist?”

Spam According to Google:

– hidden text / links

– sneaky redirects

– schemes to artificially boost links

– off-topic keywords or keyword stuffing

– duplicate content

– misleading content
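Greg didn’t say how much repetition counts as keyword stuffing, and Google publishes no threshold. As a toy illustration of the idea only (my own sketch, nothing like Google’s actual detection), a naive keyword-density check might look like:

```python
# Toy illustration only: Google publishes no density threshold, and
# real spam detection is far more sophisticated than counting words.
def keyword_density(text: str, keyword: str) -> float:
    """Fraction of words in `text` that are exactly `keyword`."""
    words = text.lower().split()
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

normal = "We sell flights to Sydney and other Australian cities"
stuffed = "cheap flights cheap flights book cheap flights now cheap flights"
print(keyword_density(normal, "cheap"))   # 0.0
print(keyword_density(stuffed, "cheap"))  # 0.4
```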

Examples to Avoid

1) Cloaking – IP delivery is acceptable if you’re delivering different information to different users. If you’re delivering different info to Googlebot vs real people, then you’re in trouble. If users coming FROM Google see the same thing Googlebot saw, then chances are you’ll be OK.

2) Sneaky Redirects –

3) Mad Lib Spam – doorway pages, keyword stuffing

4) Poor Quality Links –

5) Paid Links – Google is ridiculously good at spotting paid links.

6) Using Links – people linking to you is OK. Paid advertising links are OK. But make sure the links help users and are relevant.

Bad links don’t have quite the bad effect on ranking that people think they do.
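On the cloaking point above, Greg’s rule of thumb (users arriving from Google should see what Googlebot saw) suggests a DIY self-audit: fetch the page once with a Googlebot user-agent and once with a browser user-agent, then compare. Here’s a naive word-overlap sketch of that comparison; it’s my own illustration, not Google’s method, and the HTML strings stand in for real fetches:

```python
# Naive self-audit sketch: if the version served to Googlebot shares
# very few words with the version served to browsers, that's a hint
# of cloaking worth investigating (not proof).
def similarity(html_a: str, html_b: str) -> float:
    """Jaccard overlap of the word sets of two pages (0..1)."""
    wa, wb = set(html_a.split()), set(html_b.split())
    if not wa and not wb:
        return 1.0
    return len(wa & wb) / len(wa | wb)

def looks_cloaked(googlebot_html: str, browser_html: str,
                  threshold: float = 0.5) -> bool:
    """Flag the URL when the two versions share too few words."""
    return similarity(googlebot_html, browser_html) < threshold

honest = "<p>Welcome to our online store</p>"
stuffed = "<p>viagra cheap viagra discount viagra pills</p>"
print(looks_cloaked(honest, honest))   # False: identical content
print(looks_cloaked(honest, stuffed))  # True: very different content
```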

Google now alerts webmasters via W/M Tools if they are being penalized / banned. Even if you don’t receive that notification, if you *think* you might be penalized, feel free to submit a *reinclusion request* via W/M Tools. Google staff will take a look and make a decision. DON’T submit your reconsideration request until you’ve fixed the problems you think are causing the penalty.

Preventing Spam on Your Site

– hacked sites

– comment spam

– link injection

These lead to:

– unhappy users

– bad reputation

– lost traffic

– dangerous malware

Preventing spam on your site

– Use a CAPTCHA, e.g. a kitten CAPTCHA or a simple sum like 3+5=?

– Avoid hacking. Already compromised? Here’s how to tell:

1) Try a Google search restricted to your own site for spammy terms like viagra

2) Look at your pages with CSS and JavaScript disabled for any hidden content

3) Use Google’s W/M Tools to see which queries bring users to your site.

Check your analytics for unusual queries – look at your W/M Tools dashboard for top keywords. If you’ve been hacked, you might see random queries like *viagra* there.
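Step 2 above (checking for hidden content) can be partly automated. This is a crude sketch of my own, not anything Greg showed; real hacks also hide text via external CSS or JavaScript, so treat any hits as leads to inspect, not proof:

```python
# Crude sketch: scan raw HTML for inline styles that commonly hide
# injected spam. Hits are leads to inspect manually, not proof.
import re

HIDDEN_PATTERNS = [
    r"display\s*:\s*none",
    r"visibility\s*:\s*hidden",
    r"font-size\s*:\s*0",
    r"text-indent\s*:\s*-\d{3,}px",   # text pushed far off-screen
]

def find_hidden_styles(html: str) -> list[str]:
    """Return the hide-style patterns that appear in the page source."""
    return [p for p in HIDDEN_PATTERNS
            if re.search(p, html, re.IGNORECASE)]

page = '<div style="display:none">cheap viagra links</div><p>Hello</p>'
print(find_hidden_styles(page))
```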

Preventative Measures

– keep your 3rd party software up to date with patched versions

– use secure passwords

– if fixing a compromised site, do a clean sweep, reinstalling everything from scratch and removing any backdoors

– escape all inputs you accept through web forms to avoid SQL injection and XSS attacks

– Skipfish web application security scanner – new automated tool from Google

– GreenSQL database firewall
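The “escape all inputs” advice above can be sketched with Python’s standard library: parameterised queries keep user input out of the SQL itself, and html.escape neutralises markup before you echo it back into a page. The comments table is made up for the demo:

```python
# Sketch of input handling: parameterised SQL blocks injection,
# html.escape blocks XSS on output. The schema is hypothetical.
import html
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE comments (body TEXT)")

user_input = "'); DROP TABLE comments; --<script>alert(1)</script>"

# Parameterised: the driver binds user_input as data, never as SQL.
conn.execute("INSERT INTO comments (body) VALUES (?)", (user_input,))

# Escaped: the <script> tag is rendered inert if echoed into a page.
safe = html.escape(user_input)

stored = conn.execute("SELECT COUNT(*) FROM comments").fetchone()[0]
print(stored)  # 1 - the table survived the injection attempt
print("<script>" in safe)  # False
```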

Greg said that you should use rel=canonical where possible, provided you don’t overdo it and have hundreds of pages pointing to a single canonical URL.

Brent told the story about how his link network was penalized and he had to wait until the issue was discussed in Webmaster World and a Reinclusion Request was submitted before the penalty was lifted.

Greg stated that affiliate links shouldn’t be impacting a site too much in terms of penalties.

Somebody asked Greg about the loss of PageRank via 301 redirects, quoting Matt Cutts stating there is some loss of PR. Greg made the point that it may just be that it takes Googlebot time to follow the 301s and re-index the site, so you may just be witnessing a temporary lag in reported PR rather than a real loss.
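For anyone wanting to verify their 301s are actually in place while waiting for Googlebot to catch up, here’s a small self-contained sketch of my own (the URLs are made up; it spins up a throwaway local server so the example runs on its own, but in practice you’d point the checker at your real site):

```python
# Check that an old URL answers with a permanent 301 and the right
# Location header, without following the redirect.
import threading
import urllib.error
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class OldPage(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(301)  # permanent redirect
        self.send_header("Location", "http://example.com/new-page")
        self.end_headers()

    def log_message(self, *args):  # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), OldPage)
threading.Thread(target=server.serve_forever, daemon=True).start()

class DontFollow(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, *args, **kwargs):
        return None  # surface the redirect instead of following it

opener = urllib.request.build_opener(DontFollow)
url = "http://127.0.0.1:%d/old-page" % server.server_address[1]
try:
    opener.open(url)
    status, location = None, None
except urllib.error.HTTPError as err:
    status, location = err.code, err.headers["Location"]

print(status, location)  # 301 http://example.com/new-page
server.shutdown()
```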

Spread the joy!