SRA Risk Outlook report: Familiarity is key to AI adoption in law firms
Nick Morgan, Head of Legal Professionals Sales • January 4, 2024
In November 2023, the Solicitors Regulation Authority (SRA) published their Risk Outlook report on the use of artificial intelligence in the legal market.
The report highlighted the growing adoption of AI by legal firms due to the range of tools now available. Just a few years ago, AI-powered legal tools were only accessible to large firms. Today, they’re being used by teams of all sizes. This means that smaller, more specialised law firms have the chance to compete with bigger players.
AI adoption has increased significantly as a result: three-quarters of the largest law firms are already using AI, and over 60% of big firms are exploring generative AI, while the SRA cites anecdotal evidence that usage is also rising at small and medium-sized firms.
Given that AI tools are prevalent across the corporate world and in our personal lives, tech-savvy clients are coming to expect law firms to be leveraging AI. In fact, a few of our legal customers have said their clients are already asking whether they use AI to support billable work on their behalf.
While AI is being used to enhance and assist fee-earning work, such as contract creation and review, it’s also dramatically reducing the time taken to run client onboarding and anti-money laundering checks.
Familiarity is key to widespread adoption
The accessibility of AI tools – particularly generative AI tools, which don’t require users to have technical understanding – is helping law firms become more familiar with AI’s capabilities and benefits. Moreover, firms already using AI say that this hands-on experience eases concerns that the technology could in some way replace their roles.
For example, while Xapien’s AI-driven due diligence tools support client onboarding decisions, the technology does not in any way decide on behalf of compliance teams whether or not to onboard a client.
Client selection involves nuanced decisions that depend on many factors. Xapien automates the collection and interpretation of information on those factors, so that compliance professionals can give even more attention to the nuances of the decision itself.
At Xapien, we’re firm believers that the role of AI is to make compliance jobs easier rather than to replace them, which is why Xapien does the heavy lifting of gathering information and summarising it into a report that compliance teams can act on.
Xapien co-founder and Chief Technology Officer, Shaun O’Mahony, points out that successful AI adoption and usage depend on human-AI interaction, since generative AI relies heavily on a human asking the right question to get a useful answer. Even a simple task like asking AI to summarise an article can yield hugely different results depending on how the question is phrased.
Our latest product update includes a set of predefined queries crafted by the Xapien team, designed to extract the relevant insights clients need from the information collected, so that the crucial skill of asking the right question is built into the product.
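To illustrate how much phrasing matters, here is a minimal sketch of the same article summarised with two differently worded questions. It is not Xapien’s product: the OpenAI SDK, the model name and the file path are stand-ins chosen purely for the example.

```python
# A minimal sketch: the same source text, summarised with two differently
# phrased questions. "ask_llm" is a thin wrapper around whichever chat API
# you use (the OpenAI SDK and model name are shown purely as stand-ins).
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def ask_llm(prompt: str) -> str:
    """Send a single prompt to a chat model and return the reply text."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content or ""


article = open("article.txt").read()  # placeholder source document

# A vague question invites a vague answer...
vague = ask_llm(f"Summarise this article:\n\n{article}")

# ...while a precise, purpose-built question surfaces what a compliance
# reviewer actually needs to know.
focused = ask_llm(
    "Summarise this article in three bullet points, focusing on any "
    "allegations of financial crime, regulatory action, or sanctions "
    f"exposure involving the people or companies named:\n\n{article}"
)

print("Generic summary:\n", vague)
print("\nCompliance-focused summary:\n", focused)
```

Predefined queries of this kind simply package the second, better-phrased question so that every user gets its benefit without having to write it themselves.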
AI holds significant benefits
The SRA’s report notes that the advantages from using AI in the legal sector will continue to grow as AI develops and offers more practical and commercial benefits, and as consumers become increasingly comfortable with its use.
The biggest benefits of using AI are improved speed and efficiency in legal work, the report says. For due diligence, the traditional process relies on datasets, lists, and keyword searches, followed by human analysis. But this approach suffers from problems such as missed risks, false positives, search engine bias, and the risk of a subject being linked to an only marginally related third party.
In contrast, AI automates this process to deliver accurate insights in just minutes, allowing legal teams to focus on higher-value work. AI tools can go beyond box-ticking exercises (like PEPs and sanctions checks) to analyse all publicly available online data on individuals or corporations – just like Xapien does.
This is a clear example of how AI delivers the significant benefits mentioned in the SRA report: efficient handling of manual tasks, resources freed up for more complex, strategy-focused work, and faster, well-informed decisions.
This can be particularly beneficial for smaller firms without extensive support teams, since it gives legal professionals the bandwidth to focus on more challenging tasks.
AI risks surfaced in the report
The report also maps out the risks of using AI in the legal sector. For instance, hallucinations can cause AI tools to produce incorrect and potentially harmful results.
Inaccurate results are more concerning when there’s an assumption that technology tools are more accurate and precise than humans, since this assumption can stop people from double-checking or even questioning the output.
It’s important for legal professionals – and anyone using AI tools – to understand that just like humans, AI systems are susceptible to biases, and if left unaddressed, these biases can result in unfair or inaccurate outcomes.
This is even more relevant in an anti-money laundering context, where algorithms can learn and follow incorrect patterns in the data. To address this at Xapien, we ensure that none of our AI models are trained, and none of our algorithms improved, using customer input data.
We’ve also built a system of algorithmic safeguards using non-generative, non-machine-learning technology with roots in deep Natural Language Processing (NLP), which acts as a protective layer around the LLMs we use.
Doing this enables us to harness the power of generative AI but with unique control over the quality and accuracy of the output. In practice, it means that all information and insights surfaced by Xapien have been independently checked and verified.
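The detail of those safeguards isn’t public, but the general pattern of wrapping an LLM’s output in an independent, non-generative check can be sketched roughly as follows. The sentence splitting and word-overlap test here are deliberately crude stand-ins for real NLP, intended only to show the shape of the approach: nothing generated by the model reaches the user unless separate logic can tie it back to the source material.

```python
# Illustrative sketch of a "safeguard layer": every sentence an LLM generates
# is checked against the source material by independent, non-generative logic
# before it is surfaced. The overlap check is a simple stand-in for real NLP.
import re


def sentences(text: str) -> list[str]:
    """Naive sentence splitter, good enough for a sketch."""
    return [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]


def tokens(text: str) -> set[str]:
    """Lower-cased word tokens used for the overlap comparison."""
    return set(re.findall(r"[a-z0-9]+", text.lower()))


def is_supported(claim: str, sources: list[str], threshold: float = 0.7) -> bool:
    """Treat a claim as supported if most of its words appear together
    in at least one sentence of the source material."""
    claim_tokens = tokens(claim)
    if not claim_tokens:
        return False
    for source in sources:
        for source_sentence in sentences(source):
            overlap = len(claim_tokens & tokens(source_sentence)) / len(claim_tokens)
            if overlap >= threshold:
                return True
    return False


def verify_summary(summary: str, sources: list[str]) -> list[str]:
    """Return only the summary sentences that pass the independent check;
    anything unsupported is held back rather than shown to the user."""
    return [s for s in sentences(summary) if is_supported(s, sources)]


if __name__ == "__main__":
    sources = ["Acme Ltd was incorporated in 2010. Its director resigned in 2019."]
    summary = "Acme Ltd was incorporated in 2010. Acme was fined for fraud."
    print(verify_summary(summary, sources))  # the unsupported second claim is dropped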
There are also concerns in the legal sector about maintaining client confidentiality when using AI. Beyond the usual, well-understood risk that third-party tools can introduce additional vulnerabilities, the nature of AI services adds further risks: sensitive data may be used for training, or may be reproduced as output by generative AI in inappropriate contexts.
Xapien addresses third-party risks by encrypting sensitive data in transit and at rest. Xapien doesn’t need to provide AI algorithms with user or sensitive client information, since it only searches and analyses public information on the indexed web.
An extra layer of separation between AI processes and user information comes from the fact that traffic to websites from Xapien’s tools is ‘washed’ with traffic from other reports, so a report’s subject can’t be traced back to the user.
What could generative AI unlock for your firm?
Xapien provides comprehensive insights on individuals or organisations, automating due diligence from initial search to summarising information into a final report.
It scours millions of registries and web pages across the indexed internet, extracting and contextualising data. Drawing on a wide range of sources, including licensed datasets and official registries, Xapien processes diverse formats and uses LLMs to produce human-like written summaries.
Every piece of information is traceable down to the sentence or phrase level, showing the exact source. This traceability ensures that the reports are fully attributable.
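As a purely hypothetical illustration of what sentence-level attribution can look like in data terms, a report sentence can carry the source and the exact supporting phrase alongside the text itself. The structure and field names below are invented for this example and are not Xapien’s schema.

```python
# A hypothetical data shape for sentence-level attribution: each sentence in
# a report carries the source URL and the exact phrase it was drawn from.
# Field names are illustrative, not Xapien's actual schema.
from dataclasses import dataclass


@dataclass
class Attribution:
    source_url: str       # where the underlying information was found
    quoted_phrase: str    # the exact phrase in the source that supports the sentence


@dataclass
class ReportSentence:
    text: str                         # the sentence as it appears in the report
    attributions: list[Attribution]   # one or more sources backing it


example = ReportSentence(
    text="The company was fined by the regulator in 2021.",
    attributions=[
        Attribution(
            source_url="https://example.org/news/regulator-fine-2021",
            quoted_phrase="was fined by the regulator in March 2021",
        )
    ],
)
```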
Interested in learning more about how Xapien can enhance compliance at your law firm? Fill in the form below to speak with our team.