Google Algorithm Changes Reignite Duplicate Content Debate
If you’re swimming neck-deep in a sea of Internet marketing, then you are probably all too familiar with the polarizing effects of the duplicate content issue. For those who aren’t in the know, duplicate content is the same piece of content posted on multiple sites and blogs across the Internet, and the debate has raged for years over whether republishing content that already exists elsewhere online hurts a website. One camp is adamant that duplicate content is a sure-fire way to earn a fierce Google slap for your entire site. The other is convinced that duplicate content does not damage your site in the least. Since Google jealously guards its algorithm, it isn’t telling, which makes both theories precisely that: theories.

Since Google introduced its Panda algorithm update last year, the search engine giant has rolled out a series of “refreshes” aimed at removing undesirables from its search results. One refresh targeted article directories and websites with spammy, low-quality content; another hit sites with too many ads above the fold, visible before visitors scroll down. These rapid-fire changes have resurrected the great duplicate content debate, and many marketers now question whether it is still safe to host articles that also appear on other websites around the Web. Let’s take a look at what Matt Cutts, the head of Google’s Webspam team, has to say on the matter.

'''Cutts on Duplicate Content'''

Cutts has repeatedly addressed the duplicate content issue on Google’s Webmaster Help YouTube channel over the past few years. He has always maintained that most sites publishing duplicate content are not the highest-quality sites, and that the content being duplicated tends to be of equally low quality. In the nicest way possible, Cutts is saying that duplicate content is web spam, and Google hates it. He points out time and again that only one, or at most two, versions of a piece of content will be displayed in search results, and that Google’s algorithm determines which version is the best fit to show. However, he also consistently maintains that duplicate content will not penalize an entire site; instead, the pages carrying the duplicated content simply risk not being indexed.

The concern about duplicate content really boils down to how you want your site to perform online. If you care about your site’s ranking in search results, you owe it to yourself to routinely monitor the Web for duplicates of the content for which you rank. If you spot a copy of your original content floating around on some random site, serve the site owner a Digital Millennium Copyright Act (DMCA) notice to have the stolen content removed from the offending site.

'''CEO of SEOmoz Rand Fishkin Weighs In'''

Rand Fishkin explained in a recent SEOmoz video that a link back to your site is the key to making sure your version is the one that makes it into the index. Other factors play a role as well, such as the authority of the sites on which your article appears and the number of incoming links each of those sites has. But the links you place in the article pointing back to your site override the other considerations and cause your version to win every time.
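As an illustration of what “monitoring the Web for duplicates” can look like in practice, here is a minimal sketch that scores how closely a suspect page’s text matches your original article using word-shingle Jaccard similarity. This is not a tool the article prescribes: the shingle size, the 0.8 “near duplicate” threshold, and the placeholder input texts are assumptions made for the example.

<pre>
# Minimal sketch (Python): score how closely a suspect page's text matches
# your original article using word-shingle Jaccard similarity.
# The shingle size (5 words) and the 0.8 "near duplicate" threshold are
# assumed values for illustration, not anything Google has published.

def shingles(text, size=5):
    """Split text into overlapping word n-grams ("shingles")."""
    words = text.lower().split()
    if len(words) <= size:
        return {" ".join(words)}
    return {" ".join(words[i:i + size]) for i in range(len(words) - size + 1)}

def similarity(original, candidate):
    """Jaccard similarity between the two texts' shingle sets (0.0 to 1.0)."""
    a, b = shingles(original), shingles(candidate)
    return len(a & b) / len(a | b) if (a | b) else 0.0

# Hypothetical inputs: your published article and text scraped from a site
# you suspect of republishing it.
my_article = "Duplicate content is the same piece of content posted on multiple sites ..."
suspect_copy = "Duplicate content is the same piece of content posted on multiple sites ..."

score = similarity(my_article, suspect_copy)
if score > 0.8:
    print("Likely duplicate (similarity %.2f) -- a DMCA notice may be warranted." % score)
else:
    print("Probably distinct content (similarity %.2f)." % score)
</pre>

Shingle comparison is only one rough way to flag near-copies; the point is simply that checking for duplicates of your ranking content can be automated rather than done by eye.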
If you publish an article that’s truly exceptional (something new and unique, a real value-add for other sites in your niche), then some of those sites may be interested in republishing your original content. Many webmasters shy away from this practice, however, because they fear the wrath of Google: they think that duplicate content on one page will cause their entire site to be penalized. As Cutts has explained, this will not happen unless your entire site consists of nothing but duplicate articles.

Further, according to Fishkin: “People get really scared and concerned because they’re having duplicate content issues; they’re getting filtered … For example, Seattle Children’s Hospital licensed a lot of its medical news from other websites – they recognize they’re not going to rank – but [they wondered if it would] hurt the rankings on the results for their entire website. They thought, ‘Are we getting penalized? Are we going to have our domain penalized for hosting this content?’ The answer is almost certainly no. Unless almost 95% of your content is duplicate and you seem to be doing other kinds of manipulative things, the engines are perfectly happy to say, ‘Oh, okay, this page is filtered out, and the other pages are just fine.’ So, I wouldn’t be too scared about licensing … other kinds of content from people.”

Duplicate content will continue to be a hot-button issue for years to come, especially because the landscape of the Web is ever changing, and every change ushers in a new set of rules. The problem is that when it comes to search engine optimization, no one gets an instruction manual. So we’ll leave you with this thought: hundreds of news media outlets republish content from the Associated Press online and in print every single day, and it’s called “syndication.” It won’t get you ranked, but it won’t get you penalized either. For now.

''Taken from http://site-reference.com''