One of the most important ways to build authoritativeness and attract more relevant traffic is to continually add new, keyword-targeted content to your website. However, that content won't benefit your rankings in search engine results pages unless it's original information.
In order for your site content to benefit your search engine rankings, it must be at least 60% original text. That means if you find a topic online that you would like to write about for your site, you can use the basic idea, but you should completely rewrite the content so it is as original as possible before it's included on your website.
Legitimate Types of Duplicate Content
There are circumstances in which it's acceptable to use duplicate content on your site; to avoid duplicate content issues in those cases, make sure Google doesn't index the duplicate pages. Some examples of legitimate duplicate site content are:
Regular site content with a printer-friendly version of the same text
Duplicated pages for different membership levels on your website
Blogs or content management systems that show the same content in several different locations (homepage, archive section, category labels)
If Google determines you are deliberately copying information from other sites to build your site content and pages, your website may be penalized and lose search engine rankings. In severe cases, it may even be removed from the Google search engine results pages entirely. Therefore, it's important to handle duplicate content on your site carefully so as not to make it appear deceptive to search engines.
Google Safe Duplicate Content Strategies
When you do need to include duplicate content on your website, there are legitimate, search engine friendly strategies you can use to avoid getting penalized for it. Here are some common, acceptable strategies for handling duplicate content on your site:
Block duplicate pages. You can prevent search engines from indexing duplicate pages by blocking them in your website’s robots.txt file.
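As a sketch, a robots.txt rule that blocks crawlers from a hypothetical directory of printer-friendly pages might look like this (the /print/ path is an example, not a required name; your duplicate pages may live elsewhere):

```
# Block all crawlers from the printer-friendly copies of pages
User-agent: *
Disallow: /print/
```

Note that blocking a page in robots.txt prevents crawling, not necessarily indexing, so it works best when the blocked pages are never linked from outside your site.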
Use 301 redirects. Redirect all visitors and search engines to one version of your site with the use of a 301 redirect in the .htaccess file or administrative controls.
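For example, an .htaccess rule (assuming Apache with mod_rewrite enabled) that 301-redirects the non-www version of a domain to the www version might look like the following sketch, using the same placeholder domain as the link examples below:

```
# Permanently redirect http://website.com/... to http://www.website.com/...
RewriteEngine On
RewriteCond %{HTTP_HOST} ^website\.com$ [NC]
RewriteRule ^(.*)$ http://www.website.com/$1 [R=301,L]
```

This way, visitors and search engines that arrive at either hostname end up on a single canonical version of each page.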
Keep links consistent. Always use the same format to link to internal website pages, such as http://www.website.com/page.html (rather than http://www.website.com/page or http://website.com/page).
Avoid text repetition. If you have copyright text or the same text repeated on every page of your website, avoid duplication by linking to one specific page with the text rather than including it on every page.
Though these are some of the most common strategies for safely avoiding duplicate content issues, there are many other techniques you can use as well. Just remember that you should never try to deceive or trick the search engines to increase your search engine rankings. For more information on duplicate content issues, visit Google's Duplicate Content Help Page.
By Lisa Regall, Organic Search Analyst