Q and A: Can you please answer 4 SEO Questions?

Hello Kalena

I’m a new Search Engine College student and I have 4 questions about SEO:

1) I’ve listed my key phrases in a bulleted list. One of the items in the list is not a key phrase, but does it matter to the search engines where the key phrases sit, in terms of their order within the list?

2) I noticed something called “itemprop” in the meta description tag when I looked at the source code of my website. I know it has something to do with the “All in One SEO” plugin’s coding. If itemprop is in the meta description, will that affect my SERPs?

3) Itemprop seems to be an issue with the W3C, and the W3C Code Validator found more than 30 errors in my WordPress theme’s code. Could this also affect my SERPs?

4) I wrote more than 300 words of content for my site, and over several months I’ve been changing the wording to try to improve the site’s performance. However, when I type in a key phrase, I can’t find my site in Google. The only way I can find it (on page 5) is when I type in the city along with the key phrase.

Any suggestions would be appreciated.

Kind Regards

Ben

————————————–

Hi Ben

I don’t usually answer more than one question per post, but I’m feeling generous today ;-)

To answer your questions:

1) All other things being equal, keywords/phrases at the start of a tag or list are given slightly more relevancy weight than keywords/phrases towards the end.

2) I use the All in One SEO Pack plugin for WordPress as well and I’ve never noticed this *itemprop* you speak of. However, it appears to be a microdata attribute used to mark up embedded items (schema.org-style structured data) in your code. It shouldn’t have any impact, as the content of the meta description tag rarely has any influence on where your page ranks in the SERPs.
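If you’re curious whether the plugin really is adding it on your own pages, a quick script along these lines will pull the meta description out of your page source and show any itemprop attribute on it. This is only a rough sketch: it assumes the third-party requests and beautifulsoup4 packages are installed, and the URL is a placeholder for one of your own pages.

    # Rough sketch: fetch a page and inspect its meta description tag.
    # Assumes the third-party "requests" and "beautifulsoup4" packages;
    # the URL below is a placeholder - swap in your own page.
    import requests
    from bs4 import BeautifulSoup

    url = "https://www.example.com/"
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    # Find the meta description and report any itemprop attribute on it.
    tag = soup.find("meta", attrs={"name": "description"})
    if tag is None:
        print("No meta description tag found.")
    else:
        print("content :", tag.get("content"))
        print("itemprop:", tag.get("itemprop", "(not present)"))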

3) Yes, HTML validation errors can affect how search engines parse and index your pages, which can in turn have an impact on how well you rank. If you have used the W3C validator to check your code and it has found errors, I suggest you correct them as best you can.
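Once you’ve patched up the worst of the theme errors, you can re-check a page from a script instead of pasting it into the validator by hand. The sketch below is just that, a sketch: it assumes the requests package, and that W3C’s public Nu HTML Checker at validator.w3.org/nu/ accepts a doc URL with out=json, so confirm against W3C’s current documentation before relying on it.

    # Rough sketch: ask the W3C Nu HTML Checker to validate a live URL
    # and list the errors it reports. Assumes the "requests" package and
    # the public endpoint/parameters below - confirm against W3C's docs.
    import requests

    page = "https://www.example.com/"
    resp = requests.get(
        "https://validator.w3.org/nu/",
        params={"doc": page, "out": "json"},
        headers={"User-Agent": "validator-check-sketch"},
        timeout=30,
    )
    resp.raise_for_status()

    errors = [m for m in resp.json().get("messages", []) if m.get("type") == "error"]
    print(f"{len(errors)} error(s) found on {page}")
    for m in errors[:10]:  # show the first few only
        print("-", m.get("message"))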

4) SEO is a fluid exercise. You need to constantly tweak and refine your page code and content (and link profile) until your page starts to rank well. As long as you follow the advice in our Search Engine College lessons and on this blog, you should find an improvement over time.

Hope this helps.

——————————————————————–

Like to get geeky and teach yourself SEO? Access your Free SEO Lessons

 


Q and A: Will changing my PDF document title impact my search rank?

Hi Kalena

When optimizing a PDF, Adobe Acrobat allows users to choose if they want to display the document’s file name or its title in the title bar at the top of the document (File>Properties>Initial View>Windows Options).

During a recent talk about PDF creation I was asked if changing what’s displayed from the default file name to the actual document title would have an impact on search results.

My gut feeling is that it has a positive impact, but I don’t know enough about SEO to actually confirm this. Do you know?

Thanks heaps!

Cheers
Diane

————————————–

Hi Diane

Your gut is right! The title you give your PDF file (and how you name it) can impact how it appears and where it ranks in search results.

A lot of webmasters believe that PDFs can’t be indexed, but in fact, Google has been able to index PDF files since 2001. Despite the different encodings used in PDFs, Google can extract useful data from them, provided they’re not encrypted or password protected. If text is embedded as images, Google can even process the images with OCR algorithms to extract the text.

Just like other web pages, PDF files can rank highly in search results, depending on their content, how well they’ve been optimized, and how they’re embedded and linked to from other web pages.

Google uses two main elements to determine the title shown for PDFs: the title meta-data within the file, and the anchor text of links pointing to the PDF file. You can influence the title shown in search results for your PDF document by updating both. Doing this gives the algorithms a strong signal about which title to use.
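If you’d rather set the document title outside of Acrobat, or need to fix a whole batch of PDFs, it can also be done with a short script. Here’s a rough sketch using the third-party pypdf package; the file names and title are placeholders, and it’s worth double-checking pypdf’s current documentation before running it against anything important.

    # Rough sketch: copy a PDF and set its Title metadata with the "pypdf" package
    # (pip install pypdf). File names and the title string are placeholders.
    from pypdf import PdfReader, PdfWriter

    reader = PdfReader("guide.pdf")
    writer = PdfWriter()

    for page in reader.pages:
        writer.add_page(page)

    # Carry over any existing metadata, then set a descriptive Title.
    if reader.metadata:
        writer.add_metadata(reader.metadata)
    writer.add_metadata({"/Title": "Guide to Optimizing PDF Documents"})

    with open("guide-titled.pdf", "wb") as out:
        writer.write(out)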

Links embedded in PDF files are treated similarly to links in HTML: they can pass PageRank and other indexing signals, and Google may follow them after crawling the PDF file.

You can pick up some more tips for optimizing PDF files in these resources:

Hope this helps.

——————————————————————–

Like to learn more about SEO methods? Access your Free SEO Lessons. No catch!

 


Q and A: Is re-writing content from another blog a legitimate SEO tactic?

Hello Kalena

My sister has just hired an SEO company based in the Philippines to provide weekly content for her company blog. As I’m a bit more web-savvy than she is, she asked me to look over their service outline just to be sure she made the right decision.

The problem is, this “Google optimized content” they are providing seems to consist of copying popular blog posts from other sites in the same industry (women’s health and beauty) and re-writing them slightly before publishing. I don’t know a lot about SEO, but I am sceptical that Google would approve of this. Quite apart from the SEO considerations, the tactic just doesn’t sit right with me.

Is this a legitimate SEO tactic or could it harm my sister’s site in any way?

Thank you

Leon

————————————–

Hi Leon

You are absolutely right to be sceptical. By the sound of things, this *SEO* firm employs a technique called site scraping – where the content of other sites is copied or “scraped” and either republished unchanged on a different site, or re-written slightly and THEN republished.

Long term readers of this blog might recall my hilarious battle with site scrapers in the past and the revenge I took on them. I’ve got no problem outing site scrapers, especially when all attempts at communication have been ignored. Their tactics are not only unprofessional, but go directly against Google’s published Webmaster Guidelines.

Take BrainySEO for example. This “blog” (run by some clown called Mayank Jain in Singapore) blatantly scrapes the content of hundreds of blogs across the net, including mine. What’s hilarious is that the scraped content is run through some bizarre automated plagiarist thesaurus (I’m guessing Babel Fish) to translate it into a slightly different version of the same content as a way to avoid Google’s duplicate content filters. It is then published on servers based in the UK.

Compare these two posts:

1) My Fast Five post from week 39 (original)

2) BrainySEO’s scraped Babel Fish version (scraped)

The second (scraped) version reads like a drunk Aunty.

The service that your sister has signed up for sounds suspiciously similar. As Google reiterates in their Quality Guidelines:

“Scraped content will not provide any added value to your users without additional useful services or content provided by your site; it may also constitute copyright infringement in some cases”.

Typically, Google and other engines will ignore or filter scraped content out of the search results for target search terms. But that’s not the only negative impact it can have.

Sites like ScamAudit.com provide a rudimentary way of measuring the trustworthiness of sites and, fittingly, BrainySEO is rated as *seems suspicious*.

So my prediction is that, at best, the content your sister pays for will be worthless. At worst, it may damage the reputation of her business and the trust of her customers.

My advice is that she should sever the contract immediately, perhaps citing Google’s Quality Guidelines as her justification.

Let us know what happens!

——————————————————————–

Need to learn more about legitimate SEO tactics but not sure where to start? Access your Free SEO Lessons. No catch!

 


Fast Five in Search – Week 38, 2014


 

So I answer a lot of questions about search engines on this blog. But did you know that Google also has a Q and A site? This week’s Fast Five is dedicated to some of the more popular questions asked about Google.

Here’s this week’s Fast Five:

1) Does validating my site’s code (with a tool such as the W3C validator) help my site’s ranking in Google?

2) How can I get those links displayed under my site’s listing in Google’s search results like some other sites have?

3) Is the server location important for geotargeting?

4) Why doesn’t my site show rich snippets? I added everything and the test tool shows it’s ok.

and finally…

5) Why is my sitemap file showing a submitted URL count that does not match the number of entries in my sitemap file?

Happy reading!

*Image courtesy of Threadless.

——————————————————————–

Need to learn SEO but not sure where to start? Download your Free SEO Lesson. No catch!

 


Q and A: Will Google penalize me for redirecting my old site to my new site with duplicate content?

Hello Kalena

I have a webpage currently on a subdomain that is ranking on page 12 of the Google SERPs. I just bought a new domain name and created a new website with pretty much duplicate content so I could use that as my primary domain. What I did was redirect my subdomain to the new primary URL.

My new site has been indexed, but not yet ranked by Google. I intend to delete the subdomain page as soon as the new page starts appearing in the SERPs. My question is: because of the duplicate content, is Google going to ban me for this?

Thanks,
Paul

————————————–

Hi Paul

Ah yes, the old hosted sub-domain ranking conundrum.

For the benefit of readers who might not understand your situation, here’s a brief explanation. Paul’s current website is free-hosted on a sub-domain provided by his hosting company. For example, instead of having his site at www.PaulsPlace.com, it’s currently at PaulsPlace.hostingplace.com. This means that any links pointing to his site contribute to the hosting site’s link popularity and not his own. It also means that he is helping his hosting company to rank better in search engines, rather than his own brand and content.

To avoid this, Paul has done the right thing: he has purchased his own domain name, transferred all his site content over to the new domain, and put an automatic signpost on his current sub-domain site that redirects people to his new domain whenever they hit the old site or click on a link to it in the search engine results.

Paul, provided you used a 301 redirect on your sub-domain, there shouldn’t be any problem at all with duplicate content. In fact, this is the recommended process to use, according to Google. Just don’t forget to remove the redirect (and dump your old site) once you see your pages start to appear in the search results. You can hurry this along by creating an XML sitemap for the new site and uploading it to Google via Webmaster Tools.
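If you want to be sure the old sub-domain really is answering with a permanent 301 (and not a temporary 302, or a meta refresh), a quick check like the one below will tell you. It’s only a sketch: it assumes the requests package is installed and uses placeholder URLs based on the example above, so substitute your own.

    # Quick sketch: confirm the old sub-domain returns a permanent (301) redirect
    # pointing at the new domain. Assumes the "requests" package; the URLs are
    # placeholders based on the example above - substitute your own.
    import requests

    old_url = "http://paulsplace.hostingplace.com/"
    new_prefix = "http://www.paulsplace.com/"

    resp = requests.get(old_url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")

    print("Status  :", resp.status_code)   # want 301 (Moved Permanently)
    print("Location:", location or "(none)")

    if resp.status_code == 301 and location.startswith(new_prefix):
        print("Looks good: permanent redirect to the new domain.")
    else:
        print("Check the redirect: expected a 301 pointing at the new domain.")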

Hope this helps.

——————————————————————–

Need to learn SEO but not sure where to start? Access your Free SEO Lessons. No catch!

 
