One of the lessons in your SEO201 course says that if you run PPC campaigns and use landing pages for those campaigns that are similar in layout and content, you should prevent search engine robots from indexing them. Could you explain why?
My thinking is that the more pages the search engines index, the more exposure you might get.
PPC landing pages often look nearly identical, with the target keywords being the only real difference between them. Pages that look too similar are sometimes filtered out of the search results as duplicate content, and too much duplicate content on a domain may hurt its ability to rank highly. For that reason, I always recommend preventing robots from indexing PPC landing pages using your robots.txt file.
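As a minimal sketch, assuming your landing pages all live in a single hypothetical /landing/ directory (adjust the path to match your own site), the robots.txt entry would look like this:

    User-agent: *
    Disallow: /landing/

If the landing pages are scattered around the site rather than grouped in one folder, you would need a separate Disallow line for each URL.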
If you are using existing pages on your site as PPC landing pages and they aren’t too similar to each other, there is no need to block robots from indexing them. Make sense?