How will generative AI impact legal services? It’s all about responsibility, say lawyers

Chris Middleton, December 8, 2023
Summary:
A Westminster policy conference about lawyers’ increasing use of AI opened a Pandora’s Box of issues. What are they? And what happens next?

(© putilich - Canva.com)

Back in June 2023, two Manhattan lawyers were fined when six fictitious legal cases, generated by ChatGPT, were cited in a legal brief for a case against a Colombian airline. The judge observed that there was nothing “inherently improper” about lawyers seeking AI assistance when preparing lawsuits, but noted that the pair had:

Abandoned their responsibilities when they submitted non-existent judicial opinions, with fake quotes and citations created by the artificial intelligence tool ChatGPT, then continued to stand by the fake opinions after judicial orders called their existence into question.

The lawyers responded that they had acted in good faith, but had simply “[failed] to believe that a piece of technology could be making up cases”. 

If true, that claim is as troubling as their actions. Barely six months on from ChatGPT’s public launch, two experienced lawyers were already trusting it to generate facts at the touch of a button, in a sector that is largely built on citation, precedent, and case law. Not only did they not check their sources, but they also challenged the judge’s ruling against them. A neutral third party might see that as a staggering combination of naivety, arrogance, and irresponsibility. 

Despite all this, a Westminster Legal Policy Forum this week heard that AI will be critical to the legal profession’s success and survival, so the industry must seize the opportunities it presents. 

So, what are those opportunities?

A graphic presented by international practice Addleshaw Goddard revealed a huge number of potential use cases, with generative AIs and Large Language Models helping lawyers to draft and review documents, policies, and/or evidence, or to explain rules, regulations, and Ts & Cs, across the whole spectrum of legal work, including corporate, commercial, financial, and litigation.

At this point, we should remind ourselves that much of the legal sector’s response to AI thus far is due to the success of a single tool, ChatGPT, which has existed in the wild for just one year. Not only that, but its makers recently fired the CEO for being, in the board’s view, untrustworthy, then rehired him and fired the board instead – a board whose sole purpose was ensuring the responsible, ethical development of AI. 

Viewed in this specific light, it is hard not to picture millions of lost marbles rolling across courtroom floors or the plush carpets of City firms. So, is this a collective abandonment of common sense, an industry-wide desire to place unpredictable tools at the heart of a sector, legal services, whose stock-in-trade should be auditability, transparency, and trust? (Or, at the very least, checking its workings?)

Not a bit of it, according to Paul Caddy, Head of Insight at London law firm Shoosmiths. The profession should create a new culture of embracing change, he said:

Not, ‘What do we want? Gradual change. When do we want it? In due course.’ Change is happening now! So, we need to win over hearts and minds. Law firms are changing, and AI will increase the pace of change. There is an expectation that AI will be used in law firms.

A slide in his presentation added: ‘Why should I waste my time doing dull work, which AI could do?’

A good question. But the ‘dull work’ of the legal profession tends to include checking precedents and case law, reviewing documents, and preparing evidence – tasks that demand accuracy and citation, as that firm of New York lawyers discovered. Just prompting an AI was, and is, surely not an adequate professional response.

Shobona Iyer is Vice-Chair of the Legal Services Committee at the UK’s barristers’ organization, the Bar Council. She presented 2023 LexisNexis research showing that an average of ten percent of law firms are already using generative AI – an average brought down by relatively low use at the Bar and among lawyers in the public sector (six percent in both cases). 

Leading the charge in 2023 AI adoption are in-house lawyers within large corporates (17% current usage), followed by small law firms (13%), large law firms (12%) and medium-sized practices (also 12%, but with less daily usage).

However, the industry’s long-term response looks very different. According to LexisNexis, nearly two-thirds (64%) of large law firms are actively researching the use of generative AI, with in-house lawyers (47%), medium-sized practices (33%), and small firms (31%) some distance behind. The Bar (20%) and the public sector (14%) are the real laggards once again, with the former doubtless fearing reputational risk, and the latter perhaps mired in bureaucratic processes.

However, Iyer explained:

The Bar is quite a small amount, but AI is there on the radar. And I think the reason for this is principally the fact that we are a self-employed profession. We work alongside other law firms, so we will be using AI as well. That comes along with our instructing solicitors and other professionals in the UK and abroad.

Then she added:

It is [barristers’] duty to fight for justice, but with excellence and integrity. And that includes effective dispute resolution in all forms. Not just litigation, but also arbitration, adjudication, and mediation, as well as negotiation and dispute mitigation. 

So, we are looking at what's efficient and productive, but without compromising on the quality of justice and fairness. So, we are looking at these systems, but I think barristers are an overcautious bunch.

The human touch and pricing options

So, what does the wider profession believe AI will be used for? 

According to LexisNexis, the overwhelming focus will be basic research, cited by two-thirds of all respondents, followed by drafting documents (59%) and analyzing them (47%). (One Westminster Forum delegate described generative AI as “the cold-start tool to get you going on a subject, not the finishing touch”.)

Nearly one-third (32%) of the profession see conducting due diligence as an appropriate use of generative AI. This suggests that the law will need to rule on whether the use of tools that are, emphatically, not designed to be “truth machines” – to quote an AI CEO – should even be considered due diligence. (Who does due diligence on the AI, especially if it has scraped unlicensed content?)

Meanwhile, 20% of lawyers see AI as critical to understanding new legal concepts – which raises the question of what lawyers will spend their time doing in this new world. One answer is providing the human touch: those soft human skills that AIs, currently, are unable to provide. Might lawyers eventually become the public relations arms of intelligent machines?

One man who might know the answer is Ian Jeffrey, Chief Executive of the Law Society – the professional organization that represents solicitors in England and Wales. For him, it is not so much what AI presents today that will be important, but what it may be capable of in future.

He said:

We had a huge tipping point 12 months ago, and to some extent, we are still at an early stage in the adoption of these tools. They're not yet perfect, and we haven't understood everything that they can do; clearly the pace of development is very rapid. But I think we should all now have a very good sense of what is going to be possible – and just how quickly it may become possible. 

There are three timescales. One is the immediate or very near-term. Another is the mid-term, once the current wave of technologies has been assimilated and we start to see more of the massive, customized services that are being spoken about [we explore access to justice in a second report from the event]. 

But the third is the more distant future. There, we get into questions about how quickly people might expect to see some form of artificial general intelligence [AGI], as well as tools that are highly capable of performing certain tasks. 

Those dimensions will change around the capability and capacity of systems, and the way in which they allow different processes for work delivery. They are going to then feed into changes in the roles or delivery of legal services themselves.

So, as a representative body for solicitors, we're not under any misapprehensions. The nature of what it is to be a solicitor is likely to be significantly changed over time as a result of these tools. Nor are we under any misapprehension that we won't see changes to the organizational design of member firms, and of corporate and other in-house legal teams. 

That in turn will feed through to different choices for consumers and clients, around the different business models serving their needs. And quite a bit of discussion about pricing.

Indeed. Might legal services become cheaper and more widely accessible, because of AI? We will explore those questions in a separate report. 

But for now, does AI pose a different kind of challenge to the industry? Might consumers begin using AI for the bread-and-butter legal tasks that currently occupy local firms – conveyancing, probate, minor disputes, affidavits, and so on – tasks that are vital, but often slow and expensive? And if so, might that present another challenge to the sector: missing rungs on the career ladder for junior lawyers in terms of acquiring real-world skills and experience?

James Hutt, Founding Consultant at consultancy Paradigm Junction, made an excellent point:

With AI, quality increases with workload, not decreases as it does with junior lawyers. But AI can’t do the soft human bit.

Solicitor Shaun Jardine, who runs another legal transformation consultancy, Big Yellow Penguin, proposed a new model for the industry:

We will get into a situation where [the legal sector] will offer clients service-level choices. Think gold, silver, bronze; think car washes. 

We will have clients who say, ‘I'd be happy with the bronze option’, which will include elements of Gen-AI delivered through portals. And there will be clients who say, ‘I would like you to deliver a very personalized service’. 

So, lawyers have got to get into the habit of thinking about what clients actually value. Think of your massive gold or platinum service level, then trim it down to silver and bronze and give clients that choice. Is it partner led? Or is it newly qualified lawyer led? Is it going to be delivered tomorrow, or is it going to be delivered next week?

It won’t just be about the cheapest option, he said. It will also be about minimizing friction and drudgery for the client: 

One firm I worked with had a client who wanted employment contracts. They said to that client, ‘I will send you some templates that you can fill in yourself and I will check them for £2k. Or I will do it all for you myself in three weeks’ time for £5k. Or I will do it for you myself tomorrow, for £8k.’ They knew they could have it cheaper and slower, but the client still chose the £8k option.

However, this doesn’t address the looming problem – for the industry, at least – of AI doing things both quicker and cheaper. That promises real benefits to consumers, if not (for the foreseeable future) to wealthy clients that are engaged in expensive litigation or corporate takeovers.

Akber Datoo is founder and CEO of D2 Legal Technology. He said:

I think it's more nuanced than simply bronze, silver, gold, platinum. I'd like to think that we are guiding clients and giving them options. I think we will have done something wrong if this just becomes a race to the bottom in terms of price.

He added:

It will be super important that we develop our staff. Yes, there is a worry that the things I learned as a trainee solicitor are just not going to be so in demand. But then, perhaps it's about those trainees learning from oversight of the AI.

It's got to be a lot more involved in terms of how we package this up and take responsibility, because there's also the opportunity to take the upside here, in terms of what we provide as value.

My take

This is a topic with far-reaching consequences. One might be the AI effectively being in charge, served by experienced humans with soft skills. 

Another might be: why study law in future – or any other professional degree – if AIs have been trained on every nuance, point, and precedent? (Might AI begin devaluing the very notion of high-value professional skills?) And what if a US AI has been trained on American case law, but is widely adopted in the UK, Europe, or India?

And there is a Pandora’s Box of further questions. Such as, how to manage trust when a lawyer uses AI. And what if the lawyer blames the AI when things go wrong – which is what happened in the Manhattan case? Cue reputational damage or legal cases collapsing.

Ellen Lefley is a lawyer at the law reform and human rights charity JUSTICE. She told delegates:

Use of AI then blaming the AI is bad lawyering! There's a very clear point here, which is the ethical responsibilities of the lawyer – whether you're using analog books or AI – to take responsibility for the output. And there's a real point around transparency and accountability being interconnected and interdependent principles. Accountable lawyers are ones who are transparent about their AI use.
