Hi Kalena
Quick question.
One of the lessons in your SEO201 course says that if you run PPC campaigns and use landing pages that are similar in layout and content, you should prevent search engine robots from indexing them. Can you please explain why?
The way I see it, the more pages the search engines index, the more exposure you get.
Thanks
Alex
—————————————————————————————————
Hi Alex
PPC landing pages often look nearly identical, with the only difference between them being the target keywords used. Web pages that look too similar are sometimes filtered out of the search results as duplicate content. Too much duplicate content on a domain may impact its ability to rank highly, so I always recommend preventing robots from indexing landing pages via your robots.txt file.
If you are using existing pages on your site as PPC landing pages and they aren’t too similar to each other, there is no need to block robots from indexing them. Make sense?
Additionally, if you don't want to bother adding a new directive to your robots.txt file every time you create a new landing page, set up a dedicated landing page folder. Block that folder from search engine spiders in your robots.txt. Then, every time you add a new landing page, just make sure to put it in your landing page folder.
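For anyone wanting the exact syntax, the folder approach would look something like this (the /landing-pages/ folder name is just a placeholder; substitute whatever folder you actually create):

```txt
# robots.txt at the root of your domain
# Blocks all well-behaved crawlers from the PPC landing page folder
User-agent: *
Disallow: /landing-pages/
```

Any page you drop into that folder is then covered automatically, with no further robots.txt edits needed.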
The other way of doing it is via Google’s canonical tag (http://googlewebmastercentral.blogspot.com/2009/02/specify-your-canonical.html), but I much prefer robots.txt.
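If you do go the canonical route instead, the tag goes in the <head> of each near-duplicate landing page and points at whichever version you want to rank (the URLs below are placeholders for illustration):

```html
<!-- In the <head> of each keyword-variant landing page,
     pointing to the preferred version of the page -->
<link rel="canonical" href="http://www.example.com/preferred-landing-page/" />
</head>
```

This tells Google to consolidate ranking signals onto the preferred URL rather than filtering the variants as duplicates.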
@saurav – Yes, excellent point, thanks for that. Just be sure that your PPC ads point to the correct URL, including sub-folder, for those landing pages if you decide to put them in their own folder.
What's the impact on the landing page's Quality Score if you block it with robots.txt? If Google can't assign a strong value to the LP because robots.txt blocks it, won't that hurt your PPC bids?
Yikes. To be perfectly honest, I’ve not considered that as I assumed AdWords editorial reviews were separated technically from regular robot indexing. With so many advertisers blocking LPs, I would hope that quality scoring is a stand-alone function? Will do some research into this and add to the post.
Another option is to consider using Google Website Optimizer for your landing pages; that way the highest converting one will always be shown and the others ignored.
Did a quick Twitter poll about this and the consensus is to use "noindex, follow" on your landing pages. This allows GG to crawl but not index the pages, and should alleviate any concerns about your Quality Scores being influenced by GG's inability to reach the page.
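For reference, that "noindex, follow" approach is a robots meta tag placed in the <head> of each landing page, rather than a robots.txt rule:

```html
<!-- In the <head> of each PPC landing page: crawlable for
     quality review, but kept out of the organic index -->
<meta name="robots" content="noindex, follow">
```

Unlike a robots.txt block, this still lets the crawler fetch and read the page, which is why it shouldn't interfere with landing page quality checks.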
Kalena, we block PPC landing pages via the robots.txt file and still achieve 10/10 on QS. However, Google may be relying on the outstanding history of the account in question. We're going to conduct some further testing across other accounts to see what transpires.
Does anyone have any further information on whether no-indexing affects Quality Score? I have the same question.
From our experience, blocking crawlers from indexing your PPC landing pages does not affect Quality Score. We're still able to achieve 10s across most landing pages using either the robots.txt file or a meta NOINDEX tag.
This is useful and directly from Google:
https://adwords.google.com/support/aw/bin/answer.py?hl=en&query=landing+page+quality+bot&answer=38197&type=f