At last year’s LegalTech West Coast Conference, D. Casey Flaherty, Kia Motors America’s in-house counsel, made the provocative assertion that many lawyers are technologically incompetent, and this incompetence leads to wasted time and money. In an attempt to address this, Flaherty developed a legal tech audit (LTA) designed to test basic competencies in working with PDFs, Word documents, and Excel spreadsheets. Flaherty first administered the audit to nine outside counsel firms. According to Flaherty, all of them failed spectacularly.

After using an early version of his legal tech audit in house, Flaherty teamed up with Suffolk University’s Institute on Law Practice Technology and Innovation to make the LTA available for lawyers and law students.

Recently, both Sam Glover and I¹ had a chance to take the audit, and here’s what we found.

Lawyers Really Are Terrible at Basic Office Productivity Technology

First, there is no question that Flaherty is right that many, if not most, lawyers are shockingly bad at what really are very basic skills, like making changes in Word documents, de-duplicating Excel spreadsheets, and redacting PDF files. Indeed, Flaherty’s own administration of the audit to outside firms makes that clear. Make no mistake: these are skills that should be expected of every attorney. Editing your own documents to make them presentable and preparing documents for e-filing are legal work.

That said, the Legal Tech Audit should be considered an office productivity software skills audit, rather than a legal technology audit. This audit won’t test you on whether you know how to share client files securely via cloud storage or FTP, or whether you know how to properly encrypt client files. This is about Microsoft Word, Excel, and Adobe Acrobat, basically.

The LTA website doesn’t give away all the secrets of what is tested, and I won’t do that here either, so I’ll only talk about a few of the tasks. In Word, you will do things like move text around, delete comments, renumber contract sections, and remove identifying metadata. In Excel, you will be expected to de-duplicate a spreadsheet and perform basic calculations. Finally, in Adobe Acrobat, prepare yourself to redact documents, combine pages, and get files ready for e-filing.

The Audit is Not Ready for Prime Time

Unfortunately, while the idea of a legal tech audit is great and this one does test critical skills, the testing software is less than ideal. Flaherty clearly designed this with BigLaw in mind, assuming that the attorneys who take the test will be at a firm with its own training department, learning management system, and IT staff to facilitate taking the LTA. That setting might prevent some of the problems both Sam and I had getting into the audit and working through it, but it will not fix all the problems with the user experience.

The audit runs only on Windows, and only in IE (or possibly Chrome or Firefox with an IE plugin). You’ll need Microsoft Word and Excel. In theory, you could use an alternative office suite, but you’ll likely run into trouble because the audit expects you to do certain things in an exact way. You will also need Adobe Acrobat Professional or a similar PDF program that allows for redaction. (Not all do.)

You’ll also need full administrative access to your computer and network so that you can make the training page a trusted site in IE and let it do things like run executable files on your machine. The audit also records every keystroke you make during the test.

The invitations Sam and I received didn’t specify the system requirements (though invitations to others presumably would), so I took the audit in less than optimal conditions. I only have Mac and Linux machines at home, so I ran the entire test in Windows XP running in Parallels on a Mac, which resulted, regrettably, in most of my keystrokes not being tracked. (This is probably due to Parallels.) I used Microsoft Office 2011 for Mac, but I did not have a PDF program installed that does redaction.

Full disclosure: I have no doubt that I did horribly anyway. I don’t need to see my scores to know that. The Excel portion was a killer, and I’m not ashamed to admit I did terribly at it.

Sam took the audit on a Windows 7 machine with Microsoft Office 2010 and Adobe Acrobat X Standard (which does not do redaction, either).

Assuming you have a compatible system (and obviously you will if you take the test in what Flaherty considers an ideal testing environment), you will find that the interface itself is odd. The buttons are detached from the instructions, and the program bogged down and stopped running at one point. There is also no back button if you make a mistake and want to review previous instructions. This becomes a big issue when the instructions are vague, which they are in several places. For example, the Excel instructions do not specify which of two worksheets you should be modifying. And in the PDF section, you have to attach an exhibit that doesn’t exist in your sample documents. It may or may not be a PDF page you extracted earlier, but the instructions are not helpful.

At the end of the audit, you will get what is essentially a raw score. Your ultimate “grade” on the audit will be expressed as a function of the time it takes you to finish it, including time penalties for things you did wrong. This should arrive within a week, and the report will show you exactly where you went wrong. If you perform a task wrong or skip it altogether, you will be assessed a time penalty that is pegged to how long it would take the slowest person to complete the task.

Our concerns with the audit’s missteps may appear petty, but taken together they mean that the bad and outdated technology driving the audit (IE, really?) and the clunky user experience end up overshadowing the usefulness of the audit itself.

The Audit Could Have a Bright Future — For Some Types of Attorneys

Just because we think the Legal Tech Audit is not ready for market (even though it is already available to attorneys who would like to learn how little they know about field codes in Word) does not mean we think the audit should be written off. First, Flaherty and Suffolk are very responsive to suggestions about necessary changes, and they have already modified one part of the Excel portion of the test after Sam ran into errors.

In the right situation — BigLaw, lots of training resources, a decent amount of money and time to spend on the audit — the LTA could become a meaningful way for general counsel to benchmark the efficiency of outside counsel. Law firms will be charged $250 per user for a one-year subscription, and can pay $150 per user for an additional tutorial. Additionally, Suffolk and Flaherty have partnered with training companies that can work with firms to increase proficiency on these skills. The hoped-for (and likely) outcome is that corporations who are shopping around for outside counsel will be able to request that the firm members take the test and provide the scores so that the company can choose a cost-conscious and efficient provider.

This scenario sounds great, but it also highlights why the audit is not for everyone. At the law school where I teach, more than half the students each year have Macs, and they will probably stay with that platform if they are able to (even if their firms are officially using Windows). Solos might not have $400 to spare on learning material and are probably more likely to be running what the audit considers non-standard software, such as Google Docs or Preview. Small firms simply do not have the training department or learning management platform that the audit envisions. But if you work at the sort of firm Flaherty and Suffolk are targeting, you may very well find yourself taking this audit quite soon. Brush up on those Excel skills.


  • 2014-09-09. Originally published. The LTA costs $250 per user for a one-year license, not $250 a year for a firm-wide license. Thanks for pointing out our mistake, MG.
  • 2014-09-12. Article revised with information we learned after publication.

  1. Both Sam and I would be considered tech proficient under any rubric.