Still wondering why your legal website or blog is losing traffic? Here’s yet another possibility: Duplicate Content.
In one recent example, Conrad Saam identified a law firm website that had content that was duplicated across as many as 58 other sites.
How does this happen?
Duplicate content can occur in a variety of ways. At one end of the spectrum, it can be created because of a technical “glitch” on your site. For example, perhaps you have WordPress or a plugin configured in a way that creates duplicate page titles each time you publish a new page. It can also happen as a result of configuration mistakes with post categories and tags.
It can also be the result of content scrapers. In a nutshell, spammers scrape content from your site and publish it elsewhere.
Oftentimes, it’s the result of purchasing canned content for your website. In other words, you pay someone to build you a website (or multiple websites) and add content, and the web designer/developer uses the same page content across many other sites. This appears to be the culprit in Conrad’s example.
What’s the problem with duplicate content?
Aside from appearing cheap and potentially creating an ethics issue (in the event it misleads readers into believing the content was written by you, a lawyer), duplicate content can crush your pages’ visibility in search results.
As Google notes in their Webmaster Tools Help:
Google tries hard to index and show pages with distinct information. This filtering means, for instance, that if your site has a “regular” and “printer” version of each article, and neither of these is blocked with a noindex meta tag, we’ll choose one of them to list. In the rare cases in which Google perceives that duplicate content may be shown with intent to manipulate our rankings and deceive our users, we’ll also make appropriate adjustments in the indexing and ranking of the sites involved. As a result, the ranking of the site may suffer, or the site might be removed entirely from the Google index, in which case it will no longer appear in search results.
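If you control which version of a page gets indexed, you can signal your preference yourself. As a rough sketch (the exact markup depends on your site and its platform), the “printer” version of an article could carry the noindex meta tag Google mentions, or a duplicate page could point back to the preferred version with a canonical link. The URL below is just a placeholder for your own page:

```html
<!-- On the "printer" version of an article: tell search engines not to index it -->
<meta name="robots" content="noindex">

<!-- Or, on any duplicate version: point to the preferred ("canonical") URL -->
<link rel="canonical" href="https://example.com/articles/my-article/">
```

Many WordPress SEO plugins can add these tags for you, so check your plugin’s settings before editing templates by hand.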
There has been some recent confusion about whether duplicate content can actually hurt your site. To be clear, there isn’t a duplicate content penalty, at least not as a “penalty” is typically understood.
It’s important to distinguish duplicated sections of content, like disclaimers, from wholesale page duplication across several sites. In a nutshell, you don’t need to stress about minor duplication that serves a valid purpose. On the other hand, if you’re the target of a scraper or you’ve purchased page content that’s being used all over the place, you’ll probably want to take some corrective action.
What can be done?
The first step is to check whether you have a duplicate content problem. As Conrad recommends, you can simply search for a potentially duplicated paragraph in Google and see if other pages contain the same content. You can also check for duplicate content problems in Google Webmaster Tools.
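If you want to go a step beyond eyeballing search results, you can diff the raw text of your page against a suspected copy. This is only an illustrative sketch using Python’s standard library; the sample strings are placeholders for text you would paste from the two pages:

```python
from difflib import SequenceMatcher

def similarity(text_a: str, text_b: str) -> float:
    """Return a rough 0.0-1.0 similarity score between two blocks of text,
    comparing word-by-word rather than character-by-character."""
    return SequenceMatcher(None, text_a.split(), text_b.split()).ratio()

# Placeholder text: paste a paragraph from your page and from the other site.
your_page = "Our firm handles personal injury claims throughout the state."
other_site = "Our firm handles personal injury claims throughout the region."

score = similarity(your_page, other_site)
print(f"Similarity: {score:.0%}")  # a score near 100% suggests duplication
```

A score this crude won’t prove anything on its own, but it’s a quick way to triage which pages deserve a closer look.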
If the duplication is accidental (i.e., a configuration misstep), you simply need to identify the offending setting(s) or plugin(s). You can learn more about fixing duplicate content issues like these, here.
In the event that the duplication is more insidious, you’ll probably have a bit of legwork to do.
If you paid someone to write web content that has been duplicated, you need to check whether the terms of your agreement required that it be unique. If so, get your money back and revise (or kill) the duplicated pages.
If you’re thinking about hiring someone to write content for your site, make sure that they guarantee uniqueness. Of course, you should also maintain exclusive editorial control of everything posted to your site. Outsourcing web content can be fraught with ethical obstacles.