Big Unanswered Questions on Big Data and Ethical Obligations

As the use of big data continues to grow, there is plenty of speculation about how data analytics will shape the evolution of the legal profession. There is even more uncertainty about the ethical obligations that govern the use of that data.

Many experts point to the amended Comment 8 to Model Rule 1.1 as the golden rule for a lawyer’s technological competence.

Lawyers don’t merely have a duty of technological competence, however. Too often overlooked are other provisions in the Model Rules that have yet to be resolved, or in some cases even addressed, by the ABA and state bar associations. A recent article by Renee Knake, a Michigan State University law professor and co-director of the Kelley Institute of Ethics & the Legal Profession, broke down some of the key Model Rules impacting the ethics of big data analytics in law.

Big Data is Getting Bigger

Previously unanticipated methods of finding and contacting clients are now becoming accepted practice. Knake uses the example of Ohio’s 2013 approval of lawyers using police records to send solicitations to potential clients via text message. Although Ohio is by no means the first nor the only state to broach the use of police, accident, or even whistleblower data for identifying possible clients, the ability to take a phone number straight from a report and immediately text a potential client streamlines the process in a revolutionary way.

Competence transcends Rule 1.1 and seeps into other rules. ABA amendments to the Model Rules of Professional Conduct provide additional guidance on the need for lawyers to safeguard client data and comply with laws that guide data privacy or impose notification requirements on electronic information.

Even placing compliance in the hands of an expert holds the potential for ethical snafus if done blindly. The rules require lawyers to have the “requisite knowledge” of the technology used and a level of supervision or involvement in the processes used to analyze and secure data. Add to that Model Rules 5.1 and 5.3, which require lawyers to supervise and take responsibility for work performed by outside vendors, who are often an essential part of data analytics. While observational visits were once sufficient oversight, the identification, collection, and analysis of data in a digitized world now require an attorney to navigate the security, privacy, and integrity of data more skillfully.

That is, at least to the extent that it is reasonable. The endless considerations relating to data privacy, security, and competence may seem overwhelming, but the ABA and many state bar associations recognize that rules for the roles and responsibilities of legal professionals need to be realistic for all lawyers—not just the industry’s tech whizzes.

Comments 18 and 19 to Rule 1.6 provide a nonexhaustive list of factors that should be considered in determining what steps are “reasonable” for a lawyer to take to prevent unintended leaks or unauthorized access to client data:

  • The “sensitivity of the information.”
  • The “likelihood of disclosure if additional safeguards are not employed.”
  • The “cost of employing additional safeguards.”
  • The “difficulty of implementing the safeguards.”
  • The “extent to which the safeguards adversely affect the lawyer’s ability to represent clients (e.g., by making a device or important piece of software excessively difficult to use).”

Some states have followed the ABA’s lead, adopting some or all of the language in the comments to Model Rule 1.6.

Revolutionary Uncertainty

Still, many pressing questions remain unanswered. Data access, ownership, the anonymity of sensitive data, consent to the use of clients’ personal data, data storage, and cybersecurity are only the tip of the technological iceberg. This uncertainty has left some legal professionals leery of the increasing role of data analytics in the industry.

Knake discusses a recent Huron Legal survey of 129 legal professionals from the private and public sectors attending the LegalTech New York conference. For a sample drawn from attendees at a legal technology conference, some of the results were surprising. Only 16 percent said analytics are being used to negotiate legal fees. Only a third said they use analytics for litigation management, such as case strategy, and fewer than one in four use analytics for “law department management” or even budgeting.

Most strikingly, 10 percent of professionals surveyed said data analytics was not being applied within their organization at all. Among the challenges to implementing analytics, respondents claimed there is a “lack of accessible data,” questioned the “quality of data,” and had reservations about the cost. Nearly one in ten saw data analytics as a “threat to the practice of law.” Overall, however, the survey found that the use of big data analytics was on the rise and that the majority of professionals surveyed (64 percent) at least rely on data analytics in e-discovery. Considering the emerging trend among state bar associations to mandate that lawyers have a level of competence in e-discovery, that number is likely to continue to grow in years to come.

Law and technology have become inextricably entwined. Different types of data at each stage of representation present unique obligations and challenges. That means ethics rules need to change so they are as advanced and responsive as the technology they govern.

1 Comment

  1. Rodney Allen Hampton says:

    The problem with big data is the potential for de-anonymization. Depending on the data sets involved, individual identities can be compromised. I doubt many lawyers are even familiar with the concept of de-anonymization, let alone equipped to supervise big data ‘work’ and take reasonable measures to safeguard client confidentiality.