We have all heard the stories about AI-hallucinated cases finding their way into skeleton arguments and written submissions, but until relatively recently spotting one in the wild was a rarer occurrence.
Mangled case citations have been a feature of legal research enquiries for as long as there have been cases to cite. The seasoned law librarian can untangle jumbled years and volume numbers, decode anagrammed abbreviations, spell-check mistyped or misheard party names and, more often than not, locate your desired case.
Hallucinated citations, on the other hand, present an entirely different challenge. At first glance they seem legitimate but, despite meticulous efforts to track them down, they remain frustratingly elusive.
Take, for example, a recent encounter we had with a dubious citation during the course of an enquiry. After exhausting all available tools to decode and locate the case, our suspicion grew: could this be a rogue hallucination? The deeper we dug, the clearer it became that no such case existed, at which stage we turned to the likely source, Generative AI.
For a law librarian, encountering a hallucinated citation is a real Scooby-Doo reveal moment, so we excitedly entered prompts into various Generative AI applications – both free and paid – asking them to summarise our hallucinated case. The results were intriguing:
These examples are given not to suggest that any particular Generative AI tool should be preferred. Rather, they highlight that interrogation is key.
While the library remains the perfect starting point for legal research, with up-to-date practitioner texts and dedicated legal databases, in reality not everyone will have immediate access to such a resource and will instead begin their journey with readily available (and often free) Generative AI applications. These tools are adept at producing convincing imitations of case references and summaries, presented to the querent with an unruffled confidence that can mislead.
Keep your research on the right track with these simple steps: