A Google Primer: The History & (Likely) Future



A recent post here on Lawyerist highlighted an adjustment Google has recently made to its algorithm: the display order of search engine results pages (SERPs) is now affected by the authority of the author in addition to the authority of the page or domain. While this change will likely have some effect on rankings, and will help make authority more portable and somewhat less domain-centric, it does not represent a tectonic shift in the way the algorithm works or, for that matter, in how content is published on the Internet. On the contrary, the adjustment is simply an additional means to accomplish what Google has sought from its beginning: using signals, often social signals, to attribute authority to relevant, high-quality content and order the SERPs accordingly.

Oligarchies & Directories

If you would, indulge me for a moment while I wax nostalgic: before search engines were widely used, there were directories, like the Yahoo! Directory, that attempted to categorize websites based upon the analysis of human editors who manually evaluated websites, in combination with other, more mechanically generated signals. Directories were, and are, problematic because manual review by human editors is more biased than many are willing to accept, typically slow, and difficult to scale.

Democratizing, Accelerating, & Scaling Quality Analysis

In 1998, while at Stanford, Larry Page and Sergey Brin published a research paper that proposed a more democratic, faster, and more scalable metric for determining quality. Rather than relying on a small, centralized group of “official” human editors, a much larger network of “unofficial” human editors could be used instead. In particular, the paper proposed that the hyperlinks, or links, pointing to individual pages or entire domains could be characterized as “votes”; in theory, the SERPs would then be ordered by quality because higher-quality resources would receive more links than lower-quality ones. This metric was named PageRank, patented by Stanford, and licensed exclusively to Google. Google has likely modified its calculation of PageRank substantially since then, such as by abandoning the random-surfer model, and has likely added other signals to its algorithm, such as the very sophisticated machine-learning-generated signals included in the recent Panda update. Even so, PageRank still likely carries substantial weight within the algorithm, or at the very least, the theory behind PageRank has likely influenced many of the other signals the algorithm uses.
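The “links as votes” mechanism can be sketched in a few lines of Python. This is a toy reconstruction of the power-iteration method described in the 1998 paper, not Google’s actual implementation; the example graph and the damping factor are illustrative only.

```python
def pagerank(links, d=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to.

    Each page repeatedly distributes its score among the pages it links
    to; the damping factor d models the "random surfer" who occasionally
    jumps to a random page instead of following a link.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with equal scores
    for _ in range(iterations):
        new = {p: (1.0 - d) / n for p in pages}  # random-jump baseline
        for p, outlinks in links.items():
            targets = outlinks or pages          # dangling page: spread evenly
            share = d * rank[p] / len(targets)
            for q in targets:
                new[q] += share                  # each outlink is a "vote"
        rank = new
    return rank

# The page that attracts the most links ("votes") ends up with the top score.
graph = {"a": ["b"], "b": ["c"], "c": ["a"], "d": ["c"]}
ranks = pagerank(graph)
print(max(ranks, key=ranks.get))  # "c" — linked to by both "b" and "d"
```

Note that the scores sum to one: the model treats rank as a fixed budget of attention that links merely redistribute, which is exactly why a link is a meaningful “vote.”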

You Give & You Get

What does all of this mean? In short, to rank well in Google and other search engines that use similar signals, like Bing, a website needs to publish link-worthy content. It is often not enough to publish content about the products and services a business offers and the people who operate it; some of a website’s content needs to be compelling enough, or provide enough utility, that other websites will link to it.

Two Kinds of Content

But how are law firms or lawyers who want to rank for business-focused keywords like “Los Angeles personal injury lawyers” supposed to create compelling (read: viral) content on the subject? In truth, relatively little compelling text-based content can be created about “Los Angeles personal injury lawyers” as such, but a much larger amount of compelling non-text content can be created about personal injury law itself, such as videos and infographics.

Helpfully, the original PageRank patent proposed that much of the link-based authority attributed to compelling content would, to some extent, be distributed to all of the content on that website. In practice, this means that a website’s more mundane, business-focused content can rank very well if the website has enough high-quality links to its more compelling content, relative to its competitors. While Google’s algorithm has certainly changed over the years, it seems very likely that Google still uses some variant of the intra-site distribution of authority proposed in the 1998 PageRank paper.
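This intra-site “flow” of authority can be illustrated with the same links-as-votes model. The scenario below is hypothetical: three external pages link only to a compelling “guide” page, and a single internal link passes part of that authority on to a plain “services” page that attracts no external links of its own.

```python
def simple_rank(links, d=0.85, iterations=50):
    """Toy power-iteration rank: links maps page -> list of linked pages."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1.0 - d) / n for p in pages}
        for p, outlinks in links.items():
            targets = outlinks or pages  # dangling pages spread their rank evenly
            for q in targets:
                new[q] += d * rank[p] / len(targets)
        rank = new
    return rank

site = {
    "ext1": ["guide"],      # three external "votes" for the compelling page
    "ext2": ["guide"],
    "ext3": ["guide"],
    "guide": ["services"],  # one internal link forwards authority onward
    "services": [],         # no external links point here at all
}
ranks = simple_rank(site)
# The business page outranks the external pages despite earning no links
# of its own — it inherits authority through the internal link.
print(ranks["services"] > ranks["ext1"])
```

The design point: in this model a site’s internal link structure decides where earned authority pools, which is why linking mundane pages from a site’s most-linked content matters.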

This is not to say that only traditionally viral content, such as videos and infographics, can be compelling enough to garner links; rather, even the mundane, business-focused content on websites can garner some links as well, especially in the legal field where there is a relatively small amount of original, jurisdiction-specific, and law-related content on authoritative websites.

The Long, Hard Slog

Most lawyers understand that establishing offline authority requires hard work, determination, and time; establishing online authority is no different. While select content can generate enough links from high-quality websites to propel a site to the top of the SERPs, many (if not most) law firms and lawyers must consistently publish both sufficiently compelling and sufficiently business-focused content over an extended period to accumulate the critical mass of links required to rank well.

The More Things Change, The More They Stay The Same

Since 1998, Google has aspired to order relevant content on the SERPs according to quality and authority. In spite of all of the previous changes to Google’s algorithm, and the changes that will be implemented in the foreseeable future, it seems likely that Google still wants to do just that: order the SERPs so as to display relevant content that is more authoritative and higher-quality above relevant content that is less authoritative and lower-quality.

Steve Cook is a practicing business lawyer and software engineer in Phoenix. He operates a creative agency that specializes in website design, development, and search engine optimization for lawyers called ESQ Creative.


Current Lab Discussions
  • “order the SERPs so as to display relevant content that is more authoritative and higher-quality above relevant content that is less authoritative and lower-quality.”

    Exactly. Which is why authorship markup is so important. It helps Google understand who wrote what, which ties into authority.

    The bottom line is that HTML5 gives search engines even more information about the context of web pages. Google loves context.

    This will be a tectonic shift. And it will also be a huge competitive advantage for those that are able to adopt it early.

  • Steve Cook

    Gyi, I am not disputing that authorship markup will have some effect in Google’s algorithm. Those that adopt the practice will likely gain some advantage, but authorship markup is one aspect of an incredibly complex algorithm. So I stand by my previous statement that authorship markup is not a “tectonic shift”.

  • ” authorship markup is one aspect of an incredibly complex algorithm.”

    Agreed. No doubt “tectonic shift” is subjective. To me, it’s a significant evolution; more so, say, than the other daily algorithm tweaks that Google makes.

    While it remains to be seen exactly how HTML5 will impact search and the web in general, it does give webmasters, publishers, and others a completely new set of tools to communicate the context of their content to search engines.

    Thanks for the discussion Steve.