Law tech: AI change agents


Agentic AI refers to artificial intelligence systems and models that carry out a sequence of activities to fulfil complex functions without requiring separate prompts for each task. An AI agent can be programmed to handle a process or workflow that involves multiple tasks autonomously without human interaction. 

Joanna Goodman

One reason why agentic AI is catching on so quickly in legal is that it delivers on a familiar concept. Keith Feeny, chief technology officer at Hill Dickinson, which was recently in the news for restricting its lawyers’ unauthorised use of generative AI (GenAI), observes: ‘Agentic AI feels very much like an evolution of legal process engineering, moving from rules-based automation to supervised and ultimately autonomous members of the team.’

Digital colleagues

Peter Lee coined the term ‘legal engineer’ when he founded Wavelength, the first legal engineering firm. Wavelength was eventually acquired by Simmons & Simmons, where Lee is now a partner. The firm is introducing agentic AI and the concept of digital workers into its legal and compliance teams, using its internal large language model, Percy – named after one of the firm’s founders – and Berlin-headquartered agentic AI company Flank AI.

‘In the last year, we started seeing agentic AI as a way of augmenting our legal and compliance teams,’ Lee explains. ‘Agentic AI is disaggregating entire processes and removing the need for human interaction. This is not the same as document automation or robotic process automation because the level of engagement with an agent means it’s more like having a digital co-worker.’ Lee uses a non-disclosure agreement (NDA) agent as an example. ‘There are a couple of ways of setting up an NDA agent: you can email the agent and ask it to create an NDA using your standard template, which is like dealing with a human assistant. You can also send the agent third-party NDAs, contracts and documents to mark up with comments, which it can do because it has been trained on your firm’s playbooks and recent examples. It gives the impression of working with a human colleague.’
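The two interaction modes Lee describes – drafting a fresh NDA from a standard template, or marking up a third-party document against the firm's playbook – can be sketched as a simple dispatcher. This is purely illustrative: real agents such as Flank AI's sit behind email and LLM interfaces, and every name below is hypothetical.

```python
# Illustrative sketch only, not any vendor's implementation.
# An NDA agent receives a request and routes it to one of the two
# behaviours Lee describes: template drafting or playbook markup.

def handle_nda_request(request: dict) -> str:
    """Route a request to template drafting or third-party markup."""
    if request.get("attachment") is None:
        # No document attached: draft a fresh NDA from the firm's
        # standard template, as if asking a human assistant.
        return f"Drafted NDA from standard template for {request['counterparty']}"
    # A third-party document was attached: mark it up with comments
    # drawn from the firm's playbooks and recent examples.
    return f"Marked up {request['attachment']} with playbook comments"
```

In a real deployment the routing decision and the drafting itself would be handled by a language model trained on the firm's playbooks, rather than a hard-coded branch like this.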


Lee believes agentic AI could well disrupt the alternative legal service providers (ALSPs) market because ALSPs typically offshore this type of routine work. ‘Agentic AI will change the skills needed in legal and compliance teams as well as deliver the savings promised by robotic process automation,’ he says. Agentic AI is also scalable – in terms of scope as well as workload. ‘It’s not just about contracts,’ Lee adds. ‘You might use it for repeatable processes, filling in forms, dealing with policy and compliance queries, or starting new matters.’


However, the myriad possibilities can also create uncertainty around agentic AI, particularly where it threatens jobs. This is where the legal engineers come in. Lee says: ‘You need people who can interface with the technology and understand what lawyers and compliance functions are trying to achieve, while integrating that and the relevant playbooks into processes and workflows. It’s all very well having a fancy new technology, but if you don’t integrate it with processes and people, you will fail to realise its benefits.’

Because agentic AI is so scalable, it needs to be configured carefully. Simmons & Simmons’ GenAI model Percy draws on several large language models and ‘was configured specifically for the type of legal work we do at Simmons & Simmons in terms of temperature and so on’, Lee says. ‘We can upload client data because it is a secure application within the firm’s systems and it performs very well for the tasks we need it for, but it’s not like ChatGPT, Gemini or Claude. You can’t ask it to create a film poster – it’s pointed at NDAs, contract management, legislation and legal regulation.’

High stakes

Business Insider’s list of ‘43 start-ups to bet your career on in 2025’ includes three legal AI companies that have raised significant investment. San Francisco-based GenAI platform Harvey has raised $500m – including $300m in Series D funding earlier this month, with investors including REV, a venture capital partnership backed by LexisNexis parent company RELX Group. Swedish agentic AI start-up Legora raised a relatively modest $37m but has seen meteoric growth. London-headquartered Robin AI’s contract review platform has raised a total of $71m, including two funding rounds in 2024. Also in February, legal AI pioneer Luminance raised $75m in Series C funding, having launched its own AI agents last year. While the money shows no sign of running out, Q1 has also seen consolidation in the legal tech space with a series of acquisitions, notably the acquisition of UK legal workflow vendor Peppermint Technologies by US legal tech giant Litera.

 


Standing out from the crowd

The record amount of funding going into legal AI makes it an obvious route for legal tech start-ups and incumbents. However, an intense and crowded market makes it hard for new entrants to gain traction. Swedish agentic AI start-up Legora – known as Leya before its recent rebranding – has bucked the trend. Just a year after graduating from Silicon Valley start-up incubator Y Combinator, it is being rolled out across Bird & Bird’s global offices and has just launched a collaborative pilot project with Mishcon de Reya.

Chris Williams, Legora’s head of strategic partnerships and community, explains that Legora’s agentic AI platform was built in partnership with its early clients to help lawyers build efficiency into workflows while retaining a feeling of uniqueness and autonomy. ‘Instead of hiring lawyers to find use cases and then developing a product to address those use cases, Legora set out to build flexible functionality,’ he says. ‘Rather than selling tickets to a destination, we want to go with lawyers on the journey to discover how AI can best impact their work.’

What does this mean in terms of technology? ‘Legora is agentic AI with the ability to understand the legal context and break down tasks in the same way as a lawyer would do,’ Williams says. ‘The agent will select from various tools depending on the task. So if you want to summarise a new regulation, it will use its analysing tool. But if you are involved in litigation and want to build the chronology of a bundle of documents, it will choose its gathering tool. If you want to translate a document, it will automatically use its DeepL model. It will select the right model for the job based on the instructions you give it.’
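The tool-selection behaviour Williams describes – routing each instruction to an analysing, gathering or translation tool – can be illustrated with a toy router. This is a hedged sketch, not Legora's implementation: a production agent would use a language model to classify the task, whereas this uses simple keyword matching, and all names are hypothetical.

```python
# Illustrative sketch of agentic tool selection, not Legora's code.
# Maps trigger keywords to the specialist tools Williams mentions.

TOOLS = {
    "summarise": "analysing tool",
    "chronology": "gathering tool",
    "translate": "DeepL model",
}

def select_tool(instruction: str) -> str:
    """Pick the first tool whose trigger keyword appears in the instruction."""
    text = instruction.lower()
    for keyword, tool in TOOLS.items():
        if keyword in text:
            return tool
    return "general reasoning"  # fallback when no specialist tool matches

# Example: a litigation chronology request is routed to the gathering tool.
print(select_tool("Build a chronology of this bundle of documents"))
```

The point of the sketch is the architecture, not the matching logic: the agent decomposes the instruction, chooses among its tools, then executes, which is what distinguishes agentic AI from a single prompt-and-response exchange.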

While Williams prefers not to dwell on specific use cases, he describes how Legora saved a lawyer time by automatically populating a due diligence template with information extracted from a share purchase agreement.

He does not envisage agentic AI replacing lawyers because they still have to check its output, but it will change what they do: ‘Effectively, instead of taking the exam, you’re marking the exam. Lawyers will still be able to put a personal stamp on their work, with the assurance that offers their clients. [By taking on routine tasks] it will also free them up to build deeper client relationships and develop business – which is often one of the biggest hurdles for lawyers aspiring to become partners.’

GenAI dilemma

This month also saw the first public pushback from lawyers keen to embrace the latest AI, when an email distributed by national firm Hill Dickinson restricting unauthorised use of publicly available GenAI platforms was ‘leaked’ to the BBC.

Feeny clarifies that Hill Dickinson is not rejecting AI per se, but rather making sure that its use within the firm is controlled in a way that protects firm and client data: ‘We have put in place an AI policy that sets out guidelines on access to and use of AI tools for work-related purposes. Measures to enforce that policy include limiting access to publicly available AI solutions while enabling access to specific AI solutions to support our real estate and discovery teams. Alongside this, we are also implementing the first phase of our longer-term AI enablement strategy that includes firm-wide training, communications and access to Microsoft Copilot.’

The Hill Dickinson story echoes the ‘shadow IT’ problem that organisations experienced during the pandemic – people using personal devices and online accounts for work purposes without the knowledge of the IT function, increasing the risk to enterprise systems and data.

More optimistically, it also reflects how far the legal sector has come with respect to attitudes to technology, specifically AI. This is borne out in a key finding of a January LexisNexis survey of 887 lawyers and legal support workers in the UK and Ireland: lawyers want legal tech, and AI is a significant factor in the war for talent. ‘If a firm failed to embrace AI… 11% [of lawyers surveyed] said they would consider leaving. This escalated at larger firms, with… one in five saying they would consider leaving,’ the survey said.

Having encouraged lawyers to embrace AI, how can firms manage its use? Agentic AI is helpful, providing flexible tools that are designed for legal workflows and processes. Lee agrees, while acknowledging the LexisNexis findings: ‘Because we developed our own large language model, which is configured for legal work, it’s unlikely that people would use ChatGPT or DeepSeek instead. However, if it was turned off or unavailable, it would be as disruptive for people as losing our document management system.’

However, Lee emphasises that governance is key to successful AI implementation: ‘To that end, it is important to focus on AI literacy, particularly as from 2 February it became mandatory under the EU AI Act. If last year was about organisations trying to work out what GenAI is and how they can use it, this is the year of thinking about how we can use it responsibly.’


