Factbrowser: the research discovery engine

You might remember that a couple of weeks ago I wrote a piece called How to Find Compelling Internet Statistics.

Well after that post I heard from Keith Anderson who wanted to introduce me to a site he founded called Factbrowser.com.

Keith created Factbrowser about a year ago to help people discover the most compelling new research about technology, business, consumers, specific regions and the Internet.

The site is totally free, and it’s updated daily with new reports from hundreds of credible sources such as Nielsen, NPD, IDC and Pew, gathered from press releases, social media posts and newsfeeds.

The entire database is searchable and filterable by topic, source and region, so you can narrow down the most relevant research quite quickly. It also uses quite a detailed topic tagging system if you like that sort of thing.

Each snippet of data also has social sharing buttons in case you want to share it with your online community with one click. But what I find best of all about the site is that the source of the data is clearly highlighted in red, together with a link to its web site and Twitter account where available.

Great job Keith and thanks for sharing.

 

Share this post with others

SMX Sydney 2012 – Anne Kennedy – Duplication, syndication & all that other mess

This is a summary of Anne Kennedy’s presentation at Search Marketing Expo / Online Marketer Conference, held in Sydney 1-2 May 2012.

Anne Kennedy has co-authored the first book on international SEO and PPC, called Global Search Engine Marketing. Anne provides search engine consulting to hundreds of companies worldwide and formed an international online marketing consortium with Nordic eMarketing in Reykjavik, London, Stockholm, Rome and Beijing.

Duplicate content happens, says Anne. URL duplication is a big one. This is where you see several different versions of the same page being indexed and/or linked to. For example:

– http://www.site.com
– http://site.com
– http://www.site.com/index.shtml
– http://site.com/index.shtml

and so on.

You should always use the rel=canonical tag to point the duplicate versions of a page at the single canonical version, and also let Google know in Webmaster Tools which version of your pages to index.
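A rel=canonical tag sits in the head of each duplicate page and points at the version you want indexed. A minimal sketch, using the example URLs above as placeholders:

```html
<!-- In the <head> of each duplicate variant, e.g. http://site.com/index.shtml -->
<link rel="canonical" href="http://www.site.com/" />
```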

Anne says to watch your crawl budget. Your crawl budget is the percentage of your site that Googlebot will crawl. Googlebot rarely crawls your entire site, so keep your low quality pages out of the index by excluding them from your sitemap and blocking them using robots.txt.
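For instance, a robots.txt file that blocks hypothetical low-quality sections from being crawled (the paths here are purely illustrative) might look like this:

```text
# http://www.site.com/robots.txt
User-agent: *
Disallow: /print/     # printer-friendly duplicates
Disallow: /search/    # internal search result pages
Disallow: /tag/       # thin tag archives
```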

Common Duplication Causes

A very common duplicate content mistake is to have printer-friendly versions of your content. Whatever you do, lose the printer-friendly versions from your sitemap!

Use 301 redirects on your pages, but only when necessary, because not all link value transfers to the replacement page: PageRank does not pass 100 percent through a 301 redirect, so keep that in mind.
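On an Apache server, for example, a sketch of a site-wide 301 redirect from the non-www to the www hostname (site.com is a placeholder) might look like:

```apache
# .htaccess – permanently redirect site.com to www.site.com
RewriteEngine On
RewriteCond %{HTTP_HOST} ^site\.com$ [NC]
RewriteRule ^(.*)$ http://www.site.com/$1 [R=301,L]
```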

Think about using a separate XML feed for your product pages, says Anne. Separate out your e-commerce or product-specific pages from your main sitemap and create a sitemap just for them. Upload the two sitemaps separately in your Google Webmaster Tools account.
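Anne doesn’t specify a format, but a sketch of a product-only sitemap (the URLs are placeholders) would simply be a second XML file uploaded alongside your main one:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemap-products.xml, uploaded separately from the main sitemap -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.site.com/products/silver-widget/</loc>
  </url>
  <url>
    <loc>http://www.site.com/products/gold-widget/</loc>
  </url>
</urlset>
```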

Content syndication and site scraping can cause duplicate content headaches. If you syndicate articles or blog posts, make sure you link back to the original article with the title in the anchor text within the article body, not the footer, because some syndication sites strip links out of footers. Require syndicators to use the canonical URL version, or to exclude (noindex) the article link via their robots.txt. This will ensure Google finds the original article more easily.

Another trick is to give syndicators a logo or image to go with the article that contains a link to your article and article title in the alt tag of the logo/image. Syndicators will often miss those.
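A sketch of what that embed snippet might look like (the URLs and title are placeholders):

```html
<!-- Supplied to syndicators along with the article -->
<a href="http://www.site.com/original-article/">
  <img src="http://www.site.com/images/logo.png"
       alt="Original Article Title - Site Name" />
</a>
```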

Be sure to update your XML sitemap immediately whenever you publish a new article or blog post – you can use WordPress plugins to update your sitemap automatically for this.

If your article is out of date or no longer accurate and you want it gone from the SERPs for good, use a 410 status code to tell Google the article is GONE. This is a more permanent signal than a 404.
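On Apache, for instance, a single line in .htaccess can return the 410 (the path is illustrative):

```apache
# Return 410 Gone for a removed article
Redirect gone /old-article.html
```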

Don’t put your international content on your English TLD. If you want your content to rank well in a particular international market, you should put the content on a related TLD, e.g. a German language site should sit on site.de or, at the very least, de.site.com. Your international content will rank better in regional markets if you have links pointing to it from related TLDs, e.g. site.de will rank better in Google.de if it has plenty of .de sites linking to it.

And finally – don’t leave it up to the bots! Take control of your content.


Q and A: How Do I Tackle Regional Keyword Issues in SEO?

Hey Kalena,

I’m trying to optimize a site for the first time. It’s a fashion jewelry site. I have come up against a couple of stumbling blocks that I need a little clarification on. One is the target market: it’s a New Zealand website, but we want to target New Zealanders, Australians and the rest of the world. This brings up issues of spelling – do we focus on Jewellery (New Zealand/British spelling), Jewelry (US spelling, but where a lot of the current customers come from) or Jewellry (a common misspelling)?

Secondly, I’m having a hard time trying to choose my keyword phrases. Silver jewelry and costume jewelry (which seems to be the most common way people search for fashion jewelry, even though fashion jewelry sounds so much more modern – found out through the Google Keyword tool) seem to be the best as they are well searched for. I want to be more specific however, e.g. *women’s silver jewelry*, *silver jewellery nz* or *buy silver jewelry* etc., but the search volume according to the Google Keyword tool is well below 20 per day.

Can you please suggest what I should do in this situation?

Thank you!
Mitchell

Hi Mitchell

To answer your questions:

1) The regional spelling issue is a tricky one. There are a few ways you can approach this – do you have the .com as well as the regional Top Level Domains (TLD) .co.nz and .com.au? If so, you can use the American spelling on the .com domain and the British spelling on the regional domains. However, this may create duplicate content issues unless you block robots from the near-duplicate pages.

Alternatively, you can simply use the appropriate language version for your largest target market as the default throughout your site. For example, although we are based in New Zealand, our main target market for Search Engine College is the US, so we use American English throughout our web site. Most regional markets will understand that American English is common on the Internet, so you should not alienate them by doing this.

Another, trickier, option is to use British English on your main site to attract organic local search traffic and then create a Pay Per Click advertising campaign (e.g. Google AdWords) with tailored landing pages and ad text using American English to suit your other markets. Then, run your PPC campaign targeting only those countries where American English is used more commonly, making sure you block search engine robots from indexing your American English landing pages. You could do the reverse if you decide American English should be your default language.
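Blocking search engine robots from those landing pages can be done with a meta robots tag in each page’s head, something like:

```html
<!-- In the <head> of each PPC landing page you don't want indexed -->
<meta name="robots" content="noindex, nofollow" />
```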

As for misspellings? Those are fantastic for picking up extra traffic your competitors are missing. Best way to get that traffic is by targeting the misspelled keywords within your Pay Per Click campaign or by including the misspellings in your Page Titles and META Tags (the META Keywords tag is a particularly good place for them if you don’t want human visitors to see them).
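For illustration only (using the spellings from Mitchell’s question), the tags might look like:

```html
<title>Silver Jewellery NZ | Buy Women's Silver Jewelry Online</title>
<meta name="keywords" content="silver jewellery, silver jewelry, silver jewellry" />
```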

2) You are spot on wanting to target the longer tail keyword phrases such as *women’s silver jewelry* and *buy silver jewelry* because it is these specific phrases that are more likely to bring you qualified visitors who are more ready to purchase. But the beauty of targeting these longer phrases is that they also contain the more popular shorter search terms such as *silver jewelry* and *women’s jewelry*. So, by default, you are also optimizing your web site for these shorter phrases by integrating the longer ones into your tags and page copy.

Choosing long tail phrases that contain more generic popular search ones is a great way to save valuable keyword real estate in your page titles and meta tags. For example, instead of having to include both *buy silver jewelry* AND *silver jewelry* in your meta description tag, you only need to include the longer one as it covers both. A META Description tag of “Buy women’s silver jewelry from French Fashions” sounds a lot less redundant than “Buy silver jewelry and women’s silver jewelry and silver jewelry from French Fashions”, don’t you agree?
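In tag form, that shorter version is simply:

```html
<meta name="description" content="Buy women's silver jewelry from French Fashions" />
```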

When researching keywords for multiple international markets, remember to use a keyword research tool that offers regional search data so you can pinpoint what people are searching for in each country. Apart from regional spelling, regional jargon (such as accommodation vs lodging) can impact keyword search trends considerably.

———————————————–

Finding that optimizing your own site is a challenge? Download our Free SEO Lesson. No catch!


Q and A: Why Do I Need Regional Search Engines for Link Building?

Dear Kalena

You’ve mentioned on your blog about the importance of using resources to locate regional search engines for link building purposes. Could you explain a little further how one would use a regional search engine, and could you give a concrete example of finding one?

Thanks a lot.

Terry

Hello Terry

The reasons you might want to locate regional search engines include:

1) Your / your client’s web site contains information limited to a particular region / country.
2) Your / your client’s business owns multiple web sites with different TLD e.g. widgets.com, widgets.com.au, widgets.co.nz.
3) You / your client have multiple country target markets you wish to reach via search engines.

The situations above mean that you need to have the web sites listed in the relevant regional search engines so they can be found by the specific target markets. This is all part of the vital link building process – having your site listed in as many relevant locations on the web as possible. This is especially important now with Google placing more emphasis on local search.

Some regional search engines may find your site automatically using their crawler (e.g. Google.com.au, etc.) but others, such as niche search engines and hand-edited directories, may require you to submit the site/s manually. This is why you need to have a list of regional sites handy so you can check them all for the existence of your site/s and submit them if needed.

A couple of sources you can use to find regional search engines worldwide include:

These sites list different subcategories of search engines for various countries and regions. So, for example, if you were looking for a list of search engines and directories specific to Australia, you would click on the relevant country category and be taken to the Australian list. You could also simply type a search into Google for *list of Australian search engines* and find other lists.

You should do this for every country market that your / your client’s web site targets.

Kalena


New Home for Google Realtime Search

Remember when Google promised us they were getting close to being able to provide search results in real-time? Well this week they’ve cracked it.

In an official blog post, Google announced real-time search results are now available. But instead of being integrated into regular search results pages, real-time search has been given its own home – a dedicated page for people to conduct searches in real-time.

You can also access Realtime Search by clicking the “Updates” link in the left-hand panel of normal search results. The results appear as a constantly refreshing stream. Your Google Alerts also work with Realtime Search so you can be sent updates for your target searches within minutes of them appearing in Realtime Search.

We’ve been able to see some real-time results in SERPs already, with social search results containing recent Twitter posts and Facebook status updates. However, being able to isolate real-time search results from regular organic search results is extremely useful, especially if you are looking for information relating to an event in a specific location or a developing news story.

A couple of handy new features allow you to refine Realtime search results by location or time, and you can even see entire conversations to get context about any topic.

For example, the political situation in Australia is currently in turmoil as the country faces a hung parliament as a result of an election draw. Political developments are in flux and it’s difficult to keep up to date. If I conduct a search for “Australian election” using real-time search, I can see tweets from as recently as 1 minute ago and news stories posted within the last hour.

Realtime Search and updates in Google Alerts are available globally in 40 languages, and the geographic refinements and conversations views are available in English, Japanese, Russian and Spanish. To learn more, visit the Google Realtime Search info page.
