Q and A: Why is my site showing as redirected in the Google Search Console?

Hi Kalena,

I just checked my Google Search Console and under Crawl > Fetch as Google I see my site status showing as redirected, with a yellow sign beside it!

What does that mean and how can I fix it, please?

Regards
Zara

————————————-

Hi Zara

First up, don’t panic. The *Fetch as Google* tool simply tries to imitate what Googlebot sees when it crawls your site’s code. It doesn’t mean that Google can’t index your site. If you are still seeing data in the Search Console relating to your site status, all is well. If there were major indexing issues, you would see errors in the Crawl Errors and Search Appearance categories.

As for WHY you are seeing a redirect status, without knowing your site URL, I can only guess. But I’m confident it will be one of these reasons:

1) Domain canonicalization: Does your site load at both http:// and http://www., or does one version redirect to the other? Have you set a preferred domain in Google Search Console? To do this, you need to set up views for both versions in the Console and then set your preference. Once you have, the *Fetch as Google* tool will show a different status depending on which domain version you open within the Console.

1) Fetch as Google results for http://www.searchenginecollege.com

2) Fetch as Google results for http://searchenginecollege.com

For example, to avoid duplicate content issues in Google, I have set a) http://www.searchenginecollege.com as my preferred domain in my Search Console, but I also have a Console view for b) http://searchenginecollege.com. On the server side, I have set the non-www version to redirect to the www version with a standard 301 (permanent) redirect, sometimes called URL forwarding. If I fetch a) as Google, I see the status shown in the first image. If I fetch b) as Google, I see the yellow *redirected* status shown in the second image.

This is likely what you’re seeing and it simply means you have set up your domain redirect correctly. Learn more about domain canonicalization and how to set your preferred domain.
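If you want to confirm outside of the Console which version of your domain answers directly and which one redirects, a few lines of Python will show you. Below is a rough sketch using the third-party requests library, with the two domain versions from my example above as placeholders, so substitute your own:

# Rough sketch: check which domain version is served directly and which one
# redirects. Requires the third-party "requests" library (pip install requests).
import requests

# The two versions from the example above - substitute your own domain.
for url in ("http://www.searchenginecollege.com/", "http://searchenginecollege.com/"):
    # Don't follow redirects, so we see the raw response for each version.
    response = requests.get(url, allow_redirects=False, timeout=10)
    if response.status_code in (301, 302):
        print(url, "->", response.status_code, "redirect to", response.headers.get("Location"))
    else:
        print(url, "->", response.status_code, "(served directly)")

A 301 on one version and a 200 on the other means the canonical redirect is doing its job, which is exactly the situation the yellow *redirected* status is reporting.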

2) 301 or 302 redirects: Have you recently switched domains? Although Googlebot follows redirects, the *Fetch as Google* tool does not. So if your site was originally set up in Google Search Console as one domain, e.g. http://www.siteA.com, but has since moved to http://www.siteB.com with a 301 or 302 redirect set up server side, the original site view in the Console will show a *redirected* status in the crawl tool. You can inspect the HTTP response on the fetch details page to see the redirect details. Learn how to do this.
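If you would like to see those redirect details outside of the Console as well, you can follow the chain yourself. Another rough sketch with the requests library, using the siteA placeholder from the example above:

# Rough sketch: follow a redirect chain and print each hop, so you can see
# whether the server is issuing a 301 (permanent) or 302 (temporary) redirect.
# siteA.com is a placeholder - substitute the domain you moved away from.
import requests

response = requests.get("http://www.siteA.com/", allow_redirects=True, timeout=10)
for hop in response.history:  # each intermediate redirect response
    print(hop.status_code, hop.url, "->", hop.headers.get("Location"))
print("Final:", response.status_code, response.url)

For a permanent domain move, a 301 is the signal you want to send; a 302 tells Google the move is only temporary.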

3) Moving to SSL: If you have recently updated your site from http:// to https:// and you’re seeing *redirected* in the crawl status, you may have the same domain canonicalization issue as in 1). You need to set up a view for the https:// version of your site in Google Search Console as well. More info on SSL issues here and here.

Hope this helps!

——————————————————————-

Like to teach yourself digital marketing online? Start here.

 


Q and A: How long will it take our site to be purged from Google?

Hi Kalena,

It’s Tim here. I’m the developer for a website – [URL removed for privacy reasons] – and as of Thursday or Friday last week, Google has crawled my whole site. It shouldn’t have been able to do this, but it has.

Part of the site is written in PHP and Google has cached all the pages, several of which contain information that shouldn’t really be in the public domain.

I’ve submitted the FQDN to Google asking them to remove the URL, which will hopefully prevent any results from being shown in a Google search. The request is currently in a ‘pending’ state and I’m wondering how long this would actually take to be purged.

Thanks,

Tim

————————————-

Hi Tim

I’ve not personally lodged a takedown request with Google, so I’m afraid I’m not speaking from experience. However, colleagues have told me this can take up to 3 months if a site has already been crawled.

Your email doesn’t make it clear what happened, but it may also depend on how sensitive the content is and why it was indexed in the first place.

A couple of things you can do while you’re waiting:

1) If Google managed to crawl your whole site, you might have conflicting instructions in your robots.txt file or in the robots meta tags on those pages, or you might be including content you don’t want public in the sitemap.xml file that Google is indexing. Check all of those areas so the problem doesn’t recur (there’s a quick robots.txt check in the sketch below point 2).

2) Ask Google to remove the content through Google Search Console. This is often faster than the formal takedown request you have already submitted. It requires you to verify ownership of (or admin access to) the site via the Search Console first.
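On point 1), here is a quick way to test what your live robots.txt actually tells Googlebot before the site gets recrawled. It’s a minimal sketch using Python’s built-in robotparser; the URLs are placeholders, so substitute your own:

# Minimal sketch: check whether robots.txt allows Googlebot to fetch a page
# you intended to keep private. Standard library only; URLs are placeholders.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://www.example.com/robots.txt")
parser.read()

private_page = "https://www.example.com/private/report.php"
if parser.can_fetch("Googlebot", private_page):
    print("Googlebot is allowed to crawl", private_page, "- tighten your robots.txt rules")
else:
    print("robots.txt blocks Googlebot from", private_page)

Keep in mind that robots.txt only controls crawling, not indexing, so for pages that must never appear in results, a noindex robots meta tag (on pages Google is allowed to crawl) or the removal tool is the safer option.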

Keep in mind that even after you’ve blocked the pages from being indexed, they can take a while to fall out of Google’s search results, depending on how many data centers have cached them and where results are being served from.

Best of luck!

——————————————————————-

Like to teach yourself AdWords? Start here.

 


Q and A: Should we use commas and full stops to separate keywords in the title tag?

Hi Kalena

I’ve read that Google ignores punctuation in title tags, but I’ve seen commas as well as full stops (periods) used as a way to separate keywords for Google, and not just visually.

For example:

<title>Acme Company. SEO service in New York, SEO audit New York.</title>

Here we have two different and distinct keyword phrases: “SEO service in New York” and “SEO audit New York”. It is clear enough.

If Google ignores full stops and commas, there are many more keywords: “SEO service”, “New York”, “SEO in New York”, “SEO New York”, “audit in New York”, “SEO” and so on…

I know that the best practice is to optimize each page for 1 or 2 keywords, certainly not more than 3 keywords. So what is your opinion on the use of commas and full stops to separate keywords in the title tag?

Thank you in advance!

Max

————————————-

Hi Max

First up, the impact on search results of using punctuation in title tags is minimal. Google usually ignores commas and separation symbols. You should use punctuation primarily to write grammatically correct titles that make sense to humans.

Commas (,) should be used as commas, not separators. Full stops should be used to end a logical sentence. However, pipes (|) and hyphens (-) can be used as separators. Colons (:) aren’t ignored; they imply to search engines that what follows is a subtitle or an explanation / elaboration. Hyphens can also sometimes be interpreted as colons. As a separator, the pipe is usually preferable because of its narrow pixel width.

Having said all that, the best option is to use as little keyword real estate as possible in your title tag. That means combining your keywords into phrases that cover several individual keywords / phrases and not repeating keywords unnecessarily.

So, for example, if you are targeting the 2 phrases: “SEO service in New York” and “SEO audit New York”, then I would create the title tag as follows:

1) <title>SEO services and audits in New York City | Acme Company</title>

instead of the longer:

2) <title>Acme Company: SEO service in New York and SEO audit New York</title>

Notice that my version takes up less space, but now has no keyword repetition and includes the plurals *services* and *audits* as well as the longer *New York City* instead of the shortened version. This means that my title tag is optimized for a wider range of search terms, even though it is shorter in character length. It also includes the company name at the end of the tag, separated by a pipe.

In fact, a better version might be:

<title>Search engine optimization services and SEO audits in New York City | Acme Company</title>

Although longer, this is still a workable length for a title tag (the full tag is indexed even if the displayed title gets truncated in the results) and it would be a relevant match for search queries from people using the long form *search engine optimization* as well as the shortened version *SEO*.
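If you want a rough way to compare candidate titles as you write them, a few lines of Python will do it. The 65-character threshold below is just a rule of thumb I’m assuming for display purposes (Google actually truncates displayed titles by pixel width and still indexes the full tag), so treat it as a guide rather than a hard limit:

# Rough sketch: compare candidate title tags by character count.
# The 65-character threshold is an assumed rule of thumb for display
# truncation, not an official limit - longer titles are still indexed in full.
DISPLAY_GUIDE = 65

candidates = [
    "Acme Company. SEO service in New York, SEO audit New York.",
    "SEO services and audits in New York City | Acme Company",
    "Search engine optimization services and SEO audits in New York City | Acme Company",
]

for title in candidates:
    note = "may be truncated in results" if len(title) > DISPLAY_GUIDE else "fits comfortably"
    print(f"{len(title):3d} chars - {note}: {title}")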

Hope this helps.

——————————————————————-

Like to learn SEO? Access your Free SEO Lessons. No catch!

 


105 Free SEO Resources

If you’re a regular reader of this blog, you’ll know that I’m a big fan of time-saving tools and shortcuts that can make life easier for digital marketers.

On a recent trip down the rabbit hole of online marketing blogs, I came across a tidy collection of SEO resources collated by Amar Hussain of website brokerage firm FE International.

Pitched as the ultimate toolkit for digital marketers, the collection is unique in that all of the resources are free. This is great news for marketers on a budget and ideal for my SEO students, many of whom are still in college or on low incomes.

The resources are categorized under the following themes:

  • A/B Testing
  • Analytics
  • Competitor Analysis
  • Content
  • Diagnostic
  • Email
  • Infographics
  • Keyword Research
  • Link Research  / Link Building
  • Local SEO
  • Resources
  • SERP Tracking
  • Speed
  • Technical
  • Toolbars / Extensions
  • WordPress Plugins

While there are many tools in the list that I know well, I was pleasantly surprised to see a large number that I haven’t seen before and can add to my own toolkit. Of these, Optimizely, WordSmith and Five Second Test were the most exciting finds.

Enjoy.


Q and A: Is it good SEO practice to have navigation menus in both header and footer?

Hello Kalena

May I ask you about navigation menus on a site?

Imagine that a website has two navigation menu blocks: one in the header and one in the footer of the site. Some buttons/links are duplicated (or even all of them).

On the one hand, it’s good for the site’s visitors: when they reach the bottom of a page, there is no need to scroll back up to find and click the navigation button they need.

On the other hand, we all know that duplicated links to the same page are not ideal. Bots could read the practice as an attempt to give additional weight to the page. Moreover, duplicated navigation links, together with all the other links on the page, may push the total past 100.

However, where usability is concerned, the site design should give visitors a quick way to find the buttons they need.

My question is: "Is it good practice to add a duplicated navigation menu to both the header and footer of a website?"

What is your opinion on this topic? I’d highly appreciate your answer.

Thank you in advance!

Maksim

————————————-

Hi Maksim

The answer depends on a few factors:

1) Is your main navigation menu built with JavaScript (e.g. a drop-down menu) or other functionality that search engines may have difficulty indexing? If the answer is yes, then it might be a good idea to include a plain text navigation menu in your footer to ensure that search robots can index the links.

If the answer is no and the main navigation is already search engine friendly, then in my opinion there is no need to duplicate it. Keep in mind that the more links you have per page, the less PageRank value each link passes to the linked page, so you can dilute the value of each page on your site if you’re not careful. Also, Google recommends you keep the number of links per page to a maximum of 100 or they may not all be indexed (there’s a quick way to count your links in the sketch below point 2).

2) Does the addition of another menu help the usability of the site? i.e. is the page content so complex that visitors may require the second navigation menu to help them navigate around? If yes, then include the extra menu. If no, then… well you know the answer.
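On the 100-links guideline in point 1), here is a quick way to count the links in a page’s HTML so you can see how much a duplicated menu adds to the total. It’s a minimal sketch using only Python’s standard library; the URL is a placeholder, and links injected by JavaScript won’t be counted:

# Minimal sketch: count <a href> links in a page's raw HTML to see how close
# the page is to the ~100-links guideline. Standard library only; the URL is
# a placeholder, and JavaScript-injected links are not counted.
from html.parser import HTMLParser
from urllib.request import urlopen

class LinkCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

html = urlopen("https://www.example.com/").read().decode("utf-8", errors="ignore")
counter = LinkCounter()
counter.feed(html)
print("Links found on the page:", counter.count)

If the header and footer menus between them push that number well past 100, that is a good argument for trimming one of them.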

I guess the important thing is to make the decision with visitors foremost in mind and search engines as a secondary consideration.

——————————————————————-

Like to learn SEO? Access your Free SEO Lessons. No catch!

 
