AI Won’t Automatically Make Legal Services Cheaper

This essay was published in Lawfare’s Research Paper Series. The official version that should be cited is linked here.

The essay is co-authored with Justin Curl, a third-year student at Harvard Law School. Previously, he was a Schwarzman Scholar at Tsinghua University and earned a degree in computer science from Princeton, where we first collaborated. You can find more of his writing on AI and the law here.


Many AI leaders believe the technology will transform knowledge work. OpenAI CEO Sam Altman predicts AI systems that are “smarter than humans by 2030,”1 while Anthropic CEO Dario Amodei analogizes future AI models to a “country of geniuses in a data center.”2

Researchers identify legal services as especially vulnerable to disruption by AI.3 And since GPT-4 passed the bar exam,4 much of the profession seems to agree. Law schools have begun incorporating AI into their curricula5 and partnering with AI-focused legal-tech companies to prepare future lawyers for a changing profession.6 One prominent lawyer has argued that AI can already replace law clerks and oral argument.7 Another predicts that AI could “replace traditional lawyers by 2035.”8

This excitement about AI comes at a time when legal services are expensive. Millions of individuals are priced out of legal assistance,9 while corporate legal fees are increasing steadily, with hourly rates for partners at large law firms now exceeding $2,300.10 Unsurprisingly, many observers see the potential for AI to make legal services more accessible by delivering outcomes at lower costs.11

Our central claim is that advanced AI will not, by default, help consumers achieve their desired legal outcomes at lower costs. We examine the bottlenecks12 that stand between AI capability advances and the positive transformation of the practice of law that some envision. For AI to usher in a world of abundant legal services, the profession must address three bottlenecks: regulatory barriers, adversarial dynamics, and human involvement.

First, unauthorized practice of law (UPL) regulations may limit AI use by consumers (and to some extent lawyers). These laws prohibit nonlawyers from performing legal work.13 Individuals and organizations can face steep fines and criminal liability if courts conclude their systems cross into practicing law,14 forcing would-be providers to either limit their AI tools’ functionality in legal domains or risk enforcement actions. Entity-based regulations—which restrict who can own equity in businesses that provide legal services—restrict ...