SMX Sydney 2010: Search Engine Penalties

Greg Grothaus from Google is talking now about Search Engine Penalties.

Greg says yes, Google promotes some web sites and demotes others. They sometimes penalize web sites for spam, but most of the penalties they apply relate to malware, spyware etc. rather than spam.

Greg says they update the ranking algo 500+ times a year and that number is rising.

What is NOT Spam?

– improvements to a site’s architecture

– adding relevant keywords

– offering high quality info

– normal PR and marketing

– editorial linking to pages

What IS Spam?

Sites Positioned Above Mine

– Actually, it’s anything that violates Google’s quality guidelines. Greg then defined Black vs White hat.

“Make pages primarily for users, not for search engines”. “Does this help my users? Would I do this if search engines didn’t exist?”

Spam According to Google:

– Hidden text / links

– sneaky redirects

– schemes to artificially boost links

– off-topic kws or kw stuffing

– duplicate content

– misleading content

Examples to Avoid

1) Cloaking – IP delivery is acceptable if you're delivering different information to different users. If you're delivering different info to Googlebot vs real people, then you're in trouble. If users coming FROM Google see the same thing Googlebot sees, then chances are you'll be ok (a rough self-check is sketched below).

2) Sneaky Redirects –

3) Mad Lib Spam – doorway pages, keyword stuffing

4) Poor Quality Links –

5) Paid Links – Google is ridiculously good at spotting paid links.

6) Using Links – people linking to you is ok. Paid advertising links are ok. But make sure the links help users and are relevant.

Bad links don’t have quite the bad effect on ranking that people think they do.
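To make Greg's cloaking point concrete, here's a minimal self-check sketch: fetch one of your own pages with a Googlebot user-agent string and again with a normal browser user-agent, and compare the responses. The URL and user-agent strings below are illustrative placeholders, not an official Google test.

```python
# Minimal sketch: compare what a URL serves to Googlebot's user agent
# versus a regular browser, as a rough self-check for accidental cloaking.
import urllib.request

URL = "http://www.example.com/"  # hypothetical page to check

USER_AGENTS = {
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    "browser": "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36",
}

def fetch(url, user_agent):
    """Fetch a URL with a specific User-Agent header and return the body."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req) as resp:
        return resp.read()

bodies = {name: fetch(URL, ua) for name, ua in USER_AGENTS.items()}

if bodies["googlebot"] == bodies["browser"]:
    print("Same content served to both user agents.")
else:
    print("Responses differ -- check whether you're serving different "
          "content to Googlebot than to real users.")
```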

Google now alerts webmasters via W/M Tools if they are being penalized / banned. Even if you don't receive that email, if you *think* you might be penalized, feel free to submit a *reconsideration request* via W/M Tools. Google staff will take a look and make a decision. DON'T submit your reconsideration request until you've fixed the problems you think are causing the penalty.

Preventing Spam on Your Site

– hacked sites

– comment spam

– link injection

These lead to:

– unhappy users

– bad reputation

– lost traffic

– dangerous malware

Preventing spam on your site

– Use a Captcha e.g. kitten Captcha or 3+5=?

– Avoid getting hacked. Already compromised? Here's how to tell:

1) Try a query like: site:example.com.au viagra

2) Look at your pages with CSS and JavaScript disabled for any hidden content (a rough automated version of this check is sketched below)

3) Use Google’s W/M tools to see which queries users use to find your site.

Check your analytics for unusual queries – look at your W/M tools dashboard for top kws. If you’ve been hacked, you might see random queries like *viagra* there.
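As a rough automated version of checks 1 and 2 above, you could pull a few of your own pages and flag injected pharma-style terms or crude hidden-text styles. A minimal sketch; the URLs, spam terms and hidden-text markers are invented examples:

```python
# Pull a few of your own pages and flag suspicious injected content.
import urllib.request

PAGES = ["http://www.example.com/", "http://www.example.com/about"]  # hypothetical URLs
SPAM_TERMS = ["viagra", "cialis", "payday loan"]
HIDDEN_MARKERS = ["display:none", "visibility:hidden"]

for url in PAGES:
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore").lower()
    spam_hits = [t for t in SPAM_TERMS if t in html]
    hidden_hits = [m for m in HIDDEN_MARKERS if m in html]
    if spam_hits:
        print(f"{url}: found suspicious terms {spam_hits}")
    if hidden_hits:
        print(f"{url}: contains hidden-text styles {hidden_hits} -- check they're legitimate")
```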

Preventative Measures

– keep your 3rd party software up to date with patched versions

– use secure passwords

– if fixing a compromised site, do a clean sweep, reinstalling everything from scratch and removing any backdoors

– escape all inputs you accept through web forms to avoid SQL injection and XSS attacks (see the sketch after this list)

– Skipfish web application security scanner – new automated tool from Google

– GreenSQL database firewall
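For the "escape all inputs" point above, here's a minimal sketch of the two usual defences: parameterised SQL queries for the database, and HTML-escaping anything you echo back into a page. The table and column names are made up for illustration.

```python
import sqlite3
import html

def save_comment(conn, name, comment):
    # Parameterised query: the driver quotes the values, so a crafted
    # comment can't break out of the SQL statement (no string formatting).
    conn.execute(
        "INSERT INTO comments (author, body) VALUES (?, ?)",
        (name, comment),
    )
    conn.commit()

def render_comment(name, comment):
    # Escape before writing back into the page so <script> tags are shown
    # as text instead of executed (basic XSS defence).
    return f"<p><b>{html.escape(name)}</b>: {html.escape(comment)}</p>"

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE comments (author TEXT, body TEXT)")
save_comment(conn, "visitor", "nice post'); DROP TABLE comments;--")
print(render_comment("visitor", "<script>alert('xss')</script>"))
```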

Greg said that you should use Rel=canonical where possible, provided you don't overdo it and end up with hundreds of pages all pointing their canonical at a single URL.

Brent told the story about how his link network was penalized and he had to wait until the issue was discussed in WebmasterWorld and a reinclusion request was submitted before the penalty was lifted.

Greg stated that affiliate links shouldn’t be impacting a site too much in terms of penalties.

Somebody asked Greg about the loss of PageRank via 301 redirects, quoting Matt Cutts as stating there is some loss of PR. Greg made the point that it may just be that it takes Googlebot time to follow the 301s and re-index the site, so you may simply be witnessing a real-time PR lag rather than a permanent loss.

SMX Sydney 2010: Dealing with Domains, Parameters and URLs

Next up is Brent Payne of the Chicago Tribune. Brent trains hundreds of journalists each year to make sure they understand how to write with search engines in mind.

Search Engine Ranking Factors break down to:

Popularity

Authority

Relevance

On-page / off-page / link text / URL = the main categories for ranking

So Brent reminds us there is no penalty, only a filter. Your first *penalty* is not appearing on the first page of Google!

Parameter Handling

– Use GG W/M Tools – there's a parameter handling option in Webmaster Tools now

– 301 Redirects

Brent says the Tribune has a tool to do 301s/canonicals automatically. They have 50 different domains with 50 different goals, so sometimes there will be duplicate content. That can cause issues with Google, and instead of 1 article being filtered, ALL articles are filtered out. To fix that, canonical tags should've been used.

Remember, says Brent, the pages need to be very similar for canonicals.
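As a rough sketch of the kind of parameter handling Brent describes: strip the query parameters that don't change the content and use the cleaned URL as the rel=canonical target (and/or the 301 destination). The parameter names below are just examples, not the Tribune's actual setup.

```python
# Strip tracking/session parameters so every variant maps to one canonical URL.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

IGNORED_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "sessionid", "ref"}

def canonical_url(url):
    """Return the URL with content-irrelevant parameters removed."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in IGNORED_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

print(canonical_url("http://www.example.com/story?id=123&utm_source=feed&sessionid=abc"))
# -> http://www.example.com/story?id=123
```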

Sub Domains / Sub Directories

Via Matt Cutts' blog, Google says there's not a huge difference, except if you're targeting different countries – then use sub-domains (Brent showed an example question I asked in the comments!)

Domain Authority

Subdomains on Separate IP Blocks

Brent asked Google whether separate IP blocks make a difference to ranking. Response was “LOL Brent, that’s a new level of obsessing”. Brent found that quite interesting.

SMX Sydney 2010: Duplicate Content

Up now is Todd Friesen of Position Tech to talk about duplicate content.

Todd is a *reformed* black hat spammer. Duplicate content was the standard practice back in those days when he used to do SEO for viagra and other pharma web sites. Build 100 websites and slap up duplicate content.

Todd makes the point that duplicate content can happen easily, even with your home page. Showed 5 or 6 examples of homepage and web server configuration issues. Google *may* decide on the right version of the home page, but you should really tell them which one if you can.

Rel=canonical / 301 redirect = your friend to solve this issue
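Here's a minimal sketch of those two fixes for the homepage case, using Flask purely for illustration: 301 any non-preferred hostname to one canonical host, and emit a rel=canonical tag so /index.html-style variants consolidate to the root URL. The hostname is a hypothetical placeholder.

```python
from flask import Flask, redirect, request

app = Flask(__name__)
CANONICAL_HOST = "www.example.com"  # hypothetical preferred hostname

@app.before_request
def force_canonical_host():
    # e.g. example.com/whatever 301s to www.example.com/whatever
    if request.host != CANONICAL_HOST:
        return redirect(f"http://{CANONICAL_HOST}{request.path}", code=301)

@app.route("/")
@app.route("/index.html")
def home():
    # Both URL variants serve the page, but search engines are pointed at one URL.
    return ('<html><head>'
            f'<link rel="canonical" href="http://{CANONICAL_HOST}/">'
            '</head><body>Home page</body></html>')

if __name__ == "__main__":
    app.run()
```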

Faceted Navigation

– products exist in many categories e.g. Tigerproducts.com uses it, Dell uses it

– categories are flexible and unordered

– results in crazy amounts of duplicate content

– problem: the web site ends up with 200K products but 4 million URLs

No need to worry, Todd says. Create a directory structure that encourages Googlebot to come in via specific paths, but block all the dupe pages out of Google's index.
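One way to picture Todd's advice in code: treat the clean category/product path as indexable and mark any URL reached through facet parameters as noindex,follow. The facet parameter names here are invented examples.

```python
# Decide which robots meta value a faceted URL should emit.
from urllib.parse import urlsplit, parse_qsl

FACET_PARAMS = {"colour", "brand", "size", "sort", "page"}

def robots_meta(url):
    """Return the robots meta value to emit for a faceted URL."""
    query = dict(parse_qsl(urlsplit(url).query))
    facets = FACET_PARAMS & set(query)
    # The plain category/product URL stays indexable; anything reached
    # through facet filters stays crawlable but is kept out of the index.
    return "index,follow" if not facets else "noindex,follow"

print(robots_meta("http://www.example.com/laptops"))                       # index,follow
print(robots_meta("http://www.example.com/laptops?brand=acme&sort=price")) # noindex,follow
```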

Regional Domains

– AU, US, CA, UK etc.

– country.example.com

– example.com/country

– example.com/index.htm?lang=en

– country specific TLDs

This is easy to resolve, says Todd. You can use Rel=canonical, or simply log in to GG Webmaster Tools and tell Google which country your domain, sub-domain, or folder is associated with. Do this!

Multiple Sites and Microsites

– Keyword Domains. Bleurgh

– Why are you doing this?

– Stop it. Stop it now.

– Consolidate your sites and your effort (and this will concentrate all your link popularity to one site as well).

– Actually, Bing likes microsites. If you have to do it, do it for Bing. They love it.

SMX Sydney 2010: Bulletproof Link Building

Greg Boser from 3DogMedia is now up to talk about creating a bulletproof link strategy.

Step 1: Link Map / Blueprint

Greg's team creates a Link Map Profile. Greg says to analyse the top sites in your space and work out the scale of the job ahead.

Things to look at during this stage:

– average number of links

– average age of competing sites

– average PR of competing sites – this will determine how hard it's going to be to compete. Also take into account the unique IPs.

– average number of linking domains

– average age of linking domains

– average PR distribution of incoming links

– average concentration of anchor text

Greg then showed an example profile for an *Online Poker* site – to give us an idea of the data he's dealing with. Each of these profiles will be different. This particular profile shows that it's going to be a hard space to get results in via link strategy. It provides the blueprint.
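The aggregation behind a link map profile is simple enough to sketch: collect the same metrics for each top-ranking competitor and average them to size the job. The numbers below are invented purely to show the shape of the data; they are not Greg's poker example.

```python
# Toy aggregation of competitor link metrics into an average profile.
from statistics import mean

competitors = [
    {"links": 12500, "linking_domains": 900,  "domain_age_years": 8,  "avg_pr": 4.2},
    {"links": 8200,  "linking_domains": 610,  "domain_age_years": 6,  "avg_pr": 3.8},
    {"links": 20100, "linking_domains": 1400, "domain_age_years": 10, "avg_pr": 4.9},
]

profile = {
    metric: round(mean(c[metric] for c in competitors), 1)
    for metric in competitors[0]
}
print(profile)
# {'links': 13600, 'linking_domains': 970, 'domain_age_years': 8, 'avg_pr': 4.3}
```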

Step 2: Tactics

Once you have the big picture – your blueprint – it's time to look at the individual tactics each competitor is using to gain links:

– make a list of the top 2-3 tactics

– usually competitors will only be using 1 or 2 tactics, giving you the opportunity to overtake their efforts by using all tactics available to you.

– Yes, you can purchase links – provided they fit the criteria established in your blueprint.

– Get the quality links first – because as links age, they count more.

– When you get to the end of the process, your link map will look just how you planned it.

Tools

– Majestic SEO is highly recommended by Greg

– Linkscape

– Use tools that go out to get the data, then compare it with your own

– Site Explorer

– RavenTools

– SEOBook Toolbar

During Q and A, Greg Boser said that if you want to target Australia and get good link popularity in Oz, make sure 90% of your links are from Australian sites.

SMX Sydney 2010: Keyword Research – Beyond the Ordinary

Next Advanced SEO session is by Chris Dimmock of Cogentis to talk about SEO KW Research – Beyond the Ordinary.

You’ve mastered the KW tools – what next?

– find significant kws

– uncover competitor’s keywords

– mine the long tail for elusive but valuable obscure search terms

– expand your *virtual shelf space* in surprising ways

KW Tools:

The Adwords KW research tool is valuable, but keep in mind it's just Google data

Adwords Traffic estimator

MS Ad Intelligence for Office 2007 – you need a Bing/MSN account – has a KW extraction tool, handy for importing keywords from a URL.

WordTracker

Traditional Advice for KW Research

1) brainstorming session – but ask outsiders so you don’t get caught up in your own jargon
2) read your log files – only works if people are already finding you for those terms
3) Lots of time-consuming, grindy work with spreadsheets – the safest way

Chris’s Advice for KW Research

– singulars and plurals

– word order e.g. sydney hotels vs hotels sydney (a small variant generator is sketched after this list)

– industry jargon falls into the long tail

– searcher intent and your goals – make sure your kws become the conduit

– testing and analytics are key. Don't be afraid to use SEM to test your SEO KW conversion rates

– Define your goals – conversions, traffic, newsletter signups etc. Does the target page provide access to the goal outcome?
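As a small illustration of the singular/plural and word-order points, here's a naive variant generator. The pluralisation is deliberately crude (it just toggles a trailing "s"), so a human pass over the output is still needed.

```python
# Generate word-order and singular/plural variants of a seed phrase.
from itertools import permutations

def variants(phrase):
    words = phrase.lower().split()
    forms = set()
    for combo in permutations(words):           # word order, e.g. "hotels sydney"
        forms.add(" ".join(combo))
        # naive singular/plural toggle on the last word
        last = combo[-1]
        toggled = last[:-1] if last.endswith("s") else last + "s"
        forms.add(" ".join(combo[:-1] + (toggled,)))
    return sorted(forms)

print(variants("sydney hotels"))
# ['hotels sydney', 'hotels sydneys', 'sydney hotel', 'sydney hotels']
```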

Overview of Paid Tools

1) KW Discovery – gives you AU data

2) Word Tracker

3) SEMRush

Look at Current Trends for KW Research

– Google Trends

– AOL Hot Searches

– Google Zeitgeist

– Twitter trends

– GG Insights for Search

– Yahoo Buzz – Buzzlog.buzz.yahoo.com

– Google Buzz

– Trendistic

Chris says there’s a lot of talk about using social media to predict the future. Traditional kw tools don’t help when you’re trying to do this. Sometimes there is no recent history. So take an educated guess in that case and cover your bases.

Try KWSpy etc.

The Google Search-based Keyword Tool is very powerful – it allows you to research your competitors

– Remember to research misspellings (a quick generator is sketched below)

– Try MS adCenter Labs
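For the misspellings point, here's a quick sketch that generates one-edit typo candidates (dropped, doubled and swapped letters); you'd then keep only the ones your keyword tools show real search volume for.

```python
# Generate simple one-edit misspelling candidates for a keyword.
def misspellings(word):
    word = word.lower()
    candidates = set()
    for i in range(len(word)):
        candidates.add(word[:i] + word[i + 1:])           # dropped letter
        candidates.add(word[:i] + word[i] + word[i:])     # doubled letter
        if i < len(word) - 1:                             # adjacent swap
            candidates.add(word[:i] + word[i + 1] + word[i] + word[i + 2:])
    candidates.discard(word)  # swapping identical letters can recreate the original
    return sorted(candidates)

print(misspellings("accommodation")[:5])
```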

Geographical Sales Expansion

– different markets have different keywords to describe things e.g. accommodation vs lodging vs accommodations

Conclusion

– Use KW tools

– Use PPC to test conversions before you SEO

– Don’t believe the numerical stats from tools

– Test or you’re wasting your time