I’ve submitted my sitemap to Google several times, but it never spiders more than 57 pages, even when I add more pages. I can’t figure out why and would really appreciate your help!
My website is [URL withheld]. The sitemap I submit to Google is called sitemap.xml. I’m currently working on the site, and I want Google to find the changes and new pages.
I’ve had a look at your sitemap and your site and I’ve worked out the problem. I think you’re going to laugh 🙂
Yes, you have created an XML sitemap containing all your site URLs. Yes, you have uploaded it via your Webmaster Tools account. However, the robots.txt file on your site contains disallow rules that contradict your sitemap.
There are over 30 URLs in your robots.txt with a disallow instruction for Googlebot. Essentially, you are giving Google a list of your pages and then instructing the search giant not to go near them! Have you re-designed your site lately? Maybe your site programmers made the change during a large site edit or testing phase and forgot to remove the URLs after completion?
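To illustrate the pattern (the paths below are hypothetical examples, not your actual URLs), the conflicting setup looks something like this:

```
# robots.txt -- hypothetical example of the problem
User-agent: Googlebot
Disallow: /products/old-range/
Disallow: /about-us.html
# ...plus dozens more Disallow lines for pages
# that ALSO appear in sitemap.xml

Sitemap: https://www.example.com/sitemap.xml
```

Every URL that appears both in the sitemap and under a `Disallow:` rule for Googlebot is a page you are simultaneously advertising and blocking.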
All you need to do is edit your robots.txt file to remove the URLs being disallowed and then resubmit your XML sitemap.
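Once you have edited the file, it is worth double-checking that Googlebot is actually allowed to reach your pages before you resubmit. Here is a minimal sketch using Python's standard-library `urllib.robotparser`; the rules and URLs are made-up examples, so substitute your own domain and paths:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content -- replace with the real file
# fetched from your own site.
robots_txt = """\
User-agent: Googlebot
Disallow: /staging/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# A page still listed under Disallow is blocked for Googlebot...
print(rp.can_fetch("Googlebot", "https://www.example.com/staging/test.html"))

# ...while a page with no matching Disallow rule is crawlable.
print(rp.can_fetch("Googlebot", "https://www.example.com/new-page.html"))
```

If every URL in your sitemap comes back `True` for Googlebot, the contradiction is gone and you can safely resubmit.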
All the best.