Q and A: Is re-writing content from another blog a legitimate SEO tactic?

Hello Kalena

My sister has just hired an SEO company based in the Philippines to provide weekly content for her company blog. As I’m a bit more web-savvy than she is, she asked me to look over their service outline just to be sure she made the right decision.

Problem is, this “Google optimized content” they are providing seems to consist of copying popular blog posts from other sites in the same industry (women’s health and beauty) and re-writing them slightly before publishing. I don’t know a lot about SEO, but I’m sceptical that Google would approve of it. Besides the SEO consideration, this tactic just doesn’t sit right with me.

Is this a legitimate SEO tactic or could it harm my sister’s site in any way?

Thank you

Leon

————————————–

Hi Leon

You are absolutely right to be sceptical. By the sound of things, this *SEO* firm employs a technique called site scraping – where the content of other sites is copied or “scraped” and either republished unchanged on a different site, or re-written slightly and THEN republished.

Long-term readers of this blog might recall my hilarious battle with site scrapers in the past and the revenge I took on them. I’ve got no problem outing site scrapers, especially when all attempts at communication have been ignored. Their tactics are not only unprofessional, but also go directly against Google’s published Webmaster Guidelines.

Take BrainySEO for example. This “blog” (run by some clown called Mayank Jain in Singapore) blatantly scrapes the content of hundreds of blogs across the net, including mine. What’s hilarious is that the scraped content is run through some bizarre automated plagiarist thesaurus (I’m guessing Babel Fish) to translate it into a slightly different version of the same content as a way to avoid Google’s duplicate content filters. It is then published on servers based in the UK.

Compare these two posts:

1) My Fast Five post from week 39 (original)

2) BrainySEO’s scraped Babel Fish version (scraped)

The second (scraped) version reads like a drunk Aunty.
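
For the technically curious, here’s a rough idea of why this sort of machine “spinning” rarely slips past a duplicate content check. The snippet below is a minimal Python sketch using the standard difflib module with made-up sample strings; it’s nothing like what Google actually runs, but it shows how similar a lightly re-worded copy still measures against its source:

```python
# A minimal sketch, not Google's actual algorithm: it just shows that a
# lightly "spun" copy still measures as a near-duplicate of its source.
# Both sample strings below are made-up stand-ins for an original post
# and a scraped rewrite.
from difflib import SequenceMatcher

original = "Five quick search news items worth reading this week."
spun = "Five speedy search news objects worth perusing this week."

ratio = SequenceMatcher(None, original.lower(), spun.lower()).ratio()
print(f"Similarity: {ratio:.0%}")  # prints a high percentage despite the word swaps
```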

The service that your sister has signed up for sounds suspiciously similar. As Google reiterates in their Quality Guidelines:

“Scraped content will not provide any added value to your users without additional useful services or content provided by your site; it may also constitute copyright infringement in some cases”.

Typically, Google and other engines will ignore or filter scraped content out of the search results for target search terms. But that’s not the only negative impact it can have.

Sites like ScamAudit.com provide a rudimentary way of measuring the trustworthiness of sites and, fittingly, BrainySEO is rated as *seems suspicious*.

So my prediction is that, at best, the content your sister pays for will be worthless. At worst, it may damage the reputation of her business and the trust of her customers.

My advice is that she should terminate the contract immediately, perhaps citing Google’s Quality Guidelines as her justification.

Let us know what happens!

——————————————————————–

Need to learn more about legitimate SEO tactics but not sure where to start? Access your Free SEO Lessons. No catch!

 


Fast Five in Search – Week 38, 2014


 

So I answer a lot of questions about search engines on this blog. But did you know that Google also has a Q and A site? This week’s Fast Five is dedicated to some of the more popular questions asked about Google.

Here’s this week’s Fast Five:

1) Does validating my site’s code (with a tool such as the W3C validator) help my site’s ranking in Google?

2) How can I get those links displayed under my site’s listing in Google’s search results like some other sites have?

3) Is the server location important for geotargeting?

4) Why doesn’t my site show rich snippets? I added everything and the test tool shows it’s ok.

and finally…

5) Why is my sitemap file showing a submitted URL count that does not match the number of entries in my sitemap file?

Happy reading!

*Image courtesy of Threadless.

——————————————————————–

Need to learn SEO but not sure where to start? Download your Free SEO Lesson. No catch!

 


Q and A: Will Google penalize me for redirecting my old site to my new site with duplicate content?

Hello Kalena

I have a current subdomain webpage that is ranking on page 12 of the Google SERPs. I just bought a new domain name and created a new website with pretty much duplicate content so I could use that as my prime domain. What I did was redirect my subdomain to the new prime URL.

My new site has been indexed, but not yet ranked by Google. I intend to delete the sub-domain page as soon as the new page starts appearing in the SERPs. My question is, because of the duplicate content, is Google going to ban me for this?

Thanks,
Paul

————————————–

Hi Paul

Ah yes, the old hosted sub-domain ranking conundrum.

For the benefit of readers who might not understand your situation, here’s a brief explanation. Paul’s current website is free-hosted on a sub-domain provided by his hosting company. For example, instead of having his site at www.PaulsPlace.com, it’s currently at PaulsPlace.hostingplace.com. This means that any links pointing to his site contribute to the hosting site’s link popularity and not his own. It also means that he is helping his hosting company to rank better in search engines, rather than his own brand and content.

To avoid this, Paul has done the right thing: he has purchased his own domain name, transferred all his site content over to the new domain, and set up an automatic sign-post on his current sub-domain that redirects visitors to the new domain whenever they hit the old site or click on one of its links in search engine results.

Paul, provided you used a 301 redirect on your sub-domain, there shouldn’t be any problem at all with duplicate content. In fact, this is the recommended process to use, according to Google. Just don’t forget to remove the redirect (and dump your old site) once you see your pages start to appear in the search results. You can hurry this along by creating an XML sitemap for the new site and uploading it to Google via Webmaster Tools.
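
If you want to confirm the redirect really is a permanent 301 rather than a temporary 302, a quick check along these lines will do it. This is just a sketch using Python’s requests library, and the two URLs are the hypothetical example addresses from the explanation above:

```python
# A sketch only: checks that the old sub-domain answers with a 301 that
# points at the matching page on the new domain. The URLs are the
# hypothetical examples used above, not real sites.
import requests

OLD_URL = "http://paulsplace.hostingplace.com/"
NEW_URL = "http://www.paulsplace.com/"

resp = requests.get(OLD_URL, allow_redirects=False, timeout=10)
print(resp.status_code)              # expect 301 (permanent), not 302 (temporary)
print(resp.headers.get("Location"))  # expect an address on the new domain

if resp.status_code == 301 and str(resp.headers.get("Location", "")).startswith(NEW_URL):
    print("301 redirect looks correct")
else:
    print("Redirect needs checking")
```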

Hope this helps.

——————————————————————–

Need to learn SEO but not sure where to start? Access your Free SEO Lessons. No catch!

 


Fast Five in Search – Week 36, 2014


 

It’s that time of the week again – Fast Five time. I’m feeling quite smug that I’ve managed to consistently publish a Fast Five post every week this year to date. Blogging can be a time-consuming business, but when you follow a schedule and write about topics that educate and inform, it’s also very rewarding.

If you’re enjoying these Fast Five posts, I’d love to hear as much in the comments. Feel free to suggest some topics for future editions as well. This week, we’re going to take a look back at the five most popular Q and A posts on this blog since it was first launched.

Here’s this week’s Fast Five:

1) How much should I expect to pay for SEO services? by Peter Newsome. In this post from 2009, Pete helps an SEO start-up struggling to set realistic rates for their brand-new SEO service offering.

2) How do I avoid duplicate content created by my CMS for product pages on my site? by Peter Newsome. Another post from guest blogger Pete sees him helping a webmaster who is concerned that his Content Management System may generate product pages that are so similar in content that they may be deemed duplicate content by search engines.

3) How do I leverage Social Media to improve my SEO? by Yours Truly. In this Q and A from 2012, I explain why social media has become an integral part of SEO and suggest several ways of integrating social media marketing into your existing SEO strategy.

4) How can I get rid of malicious spam content on Google? by Yours Truly. In this Q and A from March this year, someone contacted me asking for help with malicious content being published about them. I gave advice on how to lodge a Request to Remove Objectionable Content.

and finally…

5) Why doesn’t Google index my sitemap? by Yours Truly. In one of my more recent Q and As, I helped a webmaster who couldn’t understand why Google wasn’t indexing all his site pages, despite including them all in his sitemap.

If you’ve got a burning question about search or search engines and you want to see it featured here as a Q and A, submit it via this form.

*Image courtesy of Threadless.

——————————————————————–

Need to learn SEO but not sure where to start? Download your Free SEO Lesson. No catch!

 


Q and A: Do I need to use rel=canonical to tell Google my preferred domain?

Hello Kalena

I’ve been a reader of your blog for many years but have never submitted a question. Until now!

With Google’s recent changes to the algorithm, we have noticed a drop in traffic and rankings for our site (we sell ready-made crafting kits for kids). I suspect it might be related to duplicate content as I’ve been reading how Google will penalize sites that can be loaded with www and also without www. Our site loads for both addresses and I’m worried this means we have been penalized.

I also read that you can fix this issue by using coding called rel=canonical or something like that? I have looked into this briefly, but to be honest, although I’m responsible for the content of our site, I’m a sales and marketing person, not a programmer, and I don’t think I have the coding knowledge to use this tool.

Is there a simpler way I can remove the duplicate pages or have our site load just with the www? Or will I need to pay our original web designers to fix this?

Thanks for any advice

Sally

————————————–

Hello Sally

Sorry to hear of your traffic drop, but I highly doubt it is due to your site loading for both www and non-www versions of your domain. The algorithm changes over the past 18 months have been related to more complex issues than domain versions.

Even if Google has indexed both versions of your domain, the algorithm can almost always tell that the content loading on both versions is one and the same. In this situation, Google will usually choose one version and consistently show that version in the search results.

But if you want to instruct Google which version to use in the search results, you can do this from within your Webmaster Tools account by setting the Preferred Domain (sometimes this is referred to as the canonical domain). The Preferred Domain tool enables you to tell Google if you’d like URLs from your site crawled and indexed using the www version of the domain (http://www.example.com) or the non-www version of the domain (http://example.com).

Simply click on the gear icon at the top right of your Webmaster Tools dashboard, then choose *Site Settings*, and the Preferred Domain option will come up as per the image here:

[Screenshot: setting the Preferred Domain in Webmaster Tools]

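If you’d like to see for yourself how the two versions of a domain currently respond, a check like the following will show whether one already redirects to the other or whether both are served separately. It’s only a sketch using Python’s requests library, and example.com is the same placeholder domain used above:

```python
# A sketch only: reports how the www and non-www versions of a domain
# respond. example.com is a placeholder; substitute your own domain.
import requests

for url in ("http://example.com/", "http://www.example.com/"):
    resp = requests.get(url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "(no redirect)")
    print(url, resp.status_code, location)

# If both versions return 200, Google sees two addresses serving the same
# content, and the Preferred Domain setting tells it which one to index.
```
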
The recommended use of rel=canonical is on a page-by-page basis, to indicate to Google which version of a page URL to use when there are several URLs leading to the same page content.

For example, imagine if these URLs all led to the same page content:

1) http://www.blog.com/blue-suede-shoes/
2) http://www.blog.com/blue-suede-shoes&id=72
3) http://www.blog.com/?p=12890

Now imagine that you only wanted 1) to be shown in Google search results. You could achieve this by adding the rel=canonical link element to the <head> section of each of those pages, specifying http://www.blog.com/blue-suede-shoes/ as the preferred URL.
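
For anyone who wants to check how a page expresses this preference, here is a rough Python sketch. It fetches each of the example addresses above (hypothetical URLs, not real pages) and reports where the rel=canonical link element in each page points:

```python
# A sketch only: fetches each example URL above (hypothetical addresses,
# not real pages) and reports the href of its rel=canonical link element,
# which looks like:
#   <link rel="canonical" href="http://www.blog.com/blue-suede-shoes/" />
from html.parser import HTMLParser
import requests

class CanonicalFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")

for url in ("http://www.blog.com/blue-suede-shoes/",
            "http://www.blog.com/blue-suede-shoes&id=72",
            "http://www.blog.com/?p=12890"):
    finder = CanonicalFinder()
    finder.feed(requests.get(url, timeout=10).text)
    print(url, "->", finder.canonical)  # all three should report URL 1)
```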

However, in your situation, the easiest thing would be to use the Preferred Domain tool in Webmaster Tools.

Hope this helps!

——————————————————————–

Need to learn SEO but not sure where to start? Access your Free SEO Lessons. No catch!

 
