The relentless march of artificial intelligence has reached the halls of legal education, and it is not merely knocking at the door: it is breaking it down. For generations, the assessment of aspiring lawyers has relied on time-honoured methods: the unseen exam, the lengthy essay, the formal moot. But with the rise of sophisticated generative AI tools such as ChatGPT, these legacy methods face an existential crisis. The Higher Education Policy Institute's (HEPI) 2025 Student Generative AI Survey found that 88% of UK students now use generative AI in their assessed work, up from 53% the year before, a rise of roughly two-thirds [1]. This is not a future trend; it is the present reality. The question for UK law schools is no longer if they should adapt, but how they can revolutionise assessment to be more authentic, robust, and AI-proof, ensuring that the next generation of lawyers are not just proficient prompters, but truly competent legal professionals.
The End of the Traditional Exam?
The traditional three-hour, closed-book exam has been the bedrock of legal assessment for decades. It was designed to test a student's ability to recall information, structure an argument, and write coherently under pressure. However, in a world where AI can generate a passable essay in seconds, the value of this format is diminishing rapidly. The Quality Assurance Agency (QAA), in its 2023 advice paper “Reconsidering assessment for the ChatGPT era,” urged higher education providers to move away from assessment methods that are easily susceptible to AI misuse [2]. While some institutions have toyed with returning to handwritten exams, the QAA rightly points out that this is a “regressive solution” that would disadvantage students with accessibility needs and fail to reflect the realities of modern legal practice [2].
The Rise of Authentic Assessment
The consensus among educational experts favours a move towards “authentic assessment”: tasks that mirror the real-world challenges lawyers face. This could involve anything from drafting a client advice letter to preparing a skeleton argument for a mock hearing. The goal is to assess a student's ability to apply knowledge and skills in a practical context, rather than simply to regurgitate information. As Dr. Alistair B. Cooke, a Reader in Legal Education at a Russell Group university, puts it: “We need to be assessing the skills that make a great lawyer, not just a great exam-taker. AI can't replicate the nuanced judgment, ethical reasoning, and client empathy that are the hallmarks of a top-tier legal professional.”
New Models of Assessment for the AI Age
In response to the challenges posed by AI, law schools are experimenting with a range of innovative assessment models. These new approaches are designed to be more resistant to academic misconduct while also providing a more holistic and meaningful evaluation of a student’s abilities.
Objective Structured Clinical Examinations (OSCEs)
Borrowed from the world of medical education, OSCEs are becoming increasingly popular in law schools. In an OSCE, students rotate through a series of stations, each presenting a different simulated legal task, such as interviewing a client, negotiating a contract, or advising on a point of law. This format allows for the direct observation and assessment of a wide range of practical skills, making it an excellent example of authentic assessment in action. The QAA has endorsed this approach as a way to “efficiently synoptically assess large numbers of candidates” [2].
Portfolio-Based Evaluation
Another promising approach is the use of portfolio-based assessment. Instead of relying on a single, high-stakes exam, students compile a portfolio of their work over the course of a module or even an entire degree program. This can include a variety of different outputs, such as research essays, case notes, reflective journals, and presentations. This method not only provides a more comprehensive picture of a student’s development but also encourages a more continuous and reflective approach to learning. Platforms like LexIQ, with their built-in tools for tracking progress and providing feedback, are well-suited to supporting this kind of assessment model.
The Return of the Viva Voce
Oral examinations, or vivas (from viva voce), are also making a comeback. While traditionally reserved for the postgraduate level, many law schools are now incorporating mini-vivas into their undergraduate assessment strategies. These structured interviews, conducted by two or more examiners, can be used to probe a student's understanding of a topic in depth and to verify the authenticity of their written work. The QAA notes that this approach, while resource-intensive, is a “powerful deterrent to students considering contract cheating” [2].
The Role of AI in Assessment
While much of the discussion around AI and assessment has focused on the threat of cheating, it is also important to recognise AI's potential as a positive force. Some academics are now designing assessments that actively incorporate AI tools. For example, students might be asked to use an AI to generate a first draft of a legal document and then to critique and improve upon it. This not only develops students' AI literacy but also reflects the reality that AI is increasingly used as a productivity tool in the legal profession. As Barbara Mills KC, Chair of the Bar Council, stated in late 2025, “The public is entitled to expect the highest standards of integrity and competence in the use of new technologies from legal professionals” [3]. This sentiment is echoed by the Solicitors Regulation Authority (SRA), which, while not issuing specific guidance on AI in assessment, has emphasised that the use of any technology must remain subject to its principles and standards [4].
Key Takeaways
- The rise of generative AI has seriously undermined the reliability of many traditional forms of legal assessment.
- Law schools are moving towards more authentic and AI-proof assessment methods, such as OSCEs, portfolio-based evaluation, and oral examinations.
- There is a growing recognition of the need to develop students’ AI literacy and to teach them how to use AI tools responsibly.
- The QAA has provided extensive guidance to help higher education providers navigate the challenges and opportunities of AI in assessment.
- The future of legal assessment will likely involve a blend of different methods, with a greater emphasis on practical skills and authentic, real-world tasks.
Conclusion: A Call to Action
The AI revolution is a watershed moment for legal education. It is a call to action for law schools to rethink their approach to assessment fundamentally: to move beyond the outdated methods of the past and to embrace assessment that is more authentic, more meaningful, and more relevant to the needs of the 21st-century legal profession. This will not be an easy task. It will require investment, innovation, and a willingness to challenge long-held assumptions. But it is a necessary one: the credibility of our legal qualifications and the competence of our future lawyers depend on it. For students, this new landscape offers the opportunity to develop a more practical and relevant skillset, preparing them for a future where AI is an integral part of legal practice. And for platforms like LexIQ, it presents a chance to partner with law schools in developing the next generation of assessment tools, helping to ensure that the UK remains a global leader in legal education.
References
[1] Higher Education Policy Institute. (2025). Student Generative AI Survey 2025. https://www.hepi.ac.uk/2025/12/06/weekend-reading-on-legal-education-is-ai-churning-out-super-or-surface-level-lawyers/
[2] Quality Assurance Agency. (2023). Reconsidering assessment for the ChatGPT era: QAA advice on developing sustainable assessment strategies. https://www.qaa.ac.uk/docs/qaa/members/reconsidering-assessment-for-the-chat-gpt-era.pdf?sfvrsn=38d3af81_6
[3] Bar Council. (2025). Updated guidance on generative AI for the Bar. https://www.barcouncil.org.uk/resource/updated-guidance-on-generative-ai-for-the-bar.html
[4] Solicitors Regulation Authority. (2026). Compliance tips for solicitors regarding the use of AI and other technology. https://www.sra.org.uk/solicitors/resources/innovate/compliance-tips-for-solicitors/
