In a groundbreaking move, the Solicitors Regulation Authority (SRA) has approved Garfield Law Ltd, the UK’s first fully AI-powered law firm. Founded by a former litigator and a quantum physicist, Garfield Law offers small and medium-sized businesses an AI-powered system designed to help them recover unpaid debts under £10,000 via the small claims process, handling everything from document drafting and procedural filings to case management and trial preparation. Clients can access services such as a legal letter for just £2 or initiate court proceedings for £50.
This development signifies a pivotal shift in the legal landscape, one where technology increasingly performs core legal functions, with the aim of modernising legal services and expanding access to justice. But as we celebrate innovation, we must also ask: can access to a machine genuinely be equated with access to justice?
The Appeal of Automation: Affordable, Fast, Accessible
Garfield Law presents itself as a tech-driven solution to a long-standing problem: the legal system’s inaccessibility to ordinary people with limited means. It validates a growing consensus that technology, when designed responsibly and in compliance with legal and ethical standards, can complement the work of lawyers and support the broader justice system.
There is no doubt that Garfield Law meets a real need. Millions of individuals and small businesses forgo legal claims each year because the cost of hiring a lawyer outweighs the value of the dispute, especially for cases under £10,000. In this context, Garfield Law’s model, automating simple legal processes through AI, offers a compelling alternative.
Clients can access a self-service platform online, enter basic case details and have the system automatically generate legal documents and even file claims. The efficiency and affordability are clear: what once took weeks and was often prohibitively costly can now be done in minutes and for a fraction of the price. For many, this removes a key barrier to enforcing their rights.
The Danger of a Narrow Form of Justice
While the founders claim to offer “access to justice”, others argue that the service might only provide access to process, not to justice. Critics highlight a fundamental issue: the law is not merely a set of forms and procedural steps. It is a human system of negotiation, persuasion, and judgment. Automated tools may serve straightforward claims well, but complex legal issues still require human judgment, strategy, and advocacy – areas where AI falls short.
Garfield Law does not offer personalised legal advice. It does not engage with the complexity of a client’s individual situation. Its AI system is designed to handle standard, uncontested claims, not to weigh evidence, interpret nuance, or guide clients through unexpected complications. This means that vulnerable or atypical users may be poorly served.
Furthermore, AI tools produce results without explaining how they reached their conclusion. This makes it difficult for clients, lawyers, or regulators to understand or challenge the outcome. It raises concerns about accountability, fairness, and the right to appeal, especially when the decisions affect someone’s rights or financial situation. Without meaningful human interaction, clients rely on a system they cannot question or fully understand.
In the long term, there is a risk that AI-driven legal services could widen inequality in society by creating a two-tier justice system. Low-income individuals may be offered low-cost, standardised solutions while wealthier clients continue to access personalised, strategic legal advice.
It should be noted, however, that before authorising Garfield Law, the SRA scrutinised the firm’s processes. It sought reassurance that appropriate processes are in place to quality-check work, keep client information confidential, safeguard against conflicts of interest, and manage the risk of ‘AI hallucinations’. Furthermore, under the SRA rules the solicitors in charge will ultimately be accountable and responsible for all system outputs and any issues that may arise.
Rethinking Legal Practice in the Age of AI
Garfield Law’s approval by the SRA suggests a willingness to embrace a new model of delivering legal services that may not rely on the traditional lawyer-client relationship.
Automation may handle routine tasks more efficiently, reduce legal costs, and extend services to those previously priced out of the system. It may also push the profession to modernise, adopt tech literacy, and develop hybrid models where lawyers and AI collaborate.
AI is a valuable tool but not a complete solution. Whilst it delivers access to legal processes, it does not necessarily provide access to justice in its broader, ethical sense. Justice involves listening, understanding, advocacy, fairness, and human accountability – qualities that automation alone cannot offer.
Conclusion
Garfield Law’s approval marks an important milestone but also presents a critical test. If AI is used responsibly within well-defined frameworks, protective measures, and transparent accountability structures, it has the potential to enhance access to legal services. Without these safeguards, however, it risks reducing justice to a transactional, error-prone process.