How to cross-examine your Gen AI tools and interrogate the outputs? Sally McLaren’s tips for using AI safely in legal research
Unless you’ve been completely off-grid for the past 12 months or so, you’ve likely encountered the deluge of news, articles, explainers, and enthusiastic LinkedIn posts about the wonders and/or terrors of Generative AI.
If you have been offline and missed it all, then congratulations! It’s been a lot! This bit is for you. The AI savvy/weary may skip ahead:
The term ‘artificial intelligence’ (AI) has been in use since the 1950s and refers to the simulation of human intelligence in machines, enabling them to perform tasks that typically require human-like understanding, reasoning, learning, and problem-solving.
Generative AI (Gen AI) is a type of AI that can create or generate content, such as text, images, or other data, by learning from large datasets and producing novel outputs based on observed patterns. Popular examples include ChatGPT, Claude and Gemini.
There are probably as many Gen AI evangelists as there are prophets of doom, but between the two camps lies a DMZ populated by many more wary adopters, curious sceptics and AI casuals. It is increasingly unrealistic to think that students, pupils, barristers, or indeed law librarians, won’t be using Gen AI. Quite the opposite: leveraging these new tools is fast becoming a marketable skill. However, as useful as Gen AI can be, there is a significant degree of risk attached to employing it in your studies and practice.
Here are eight tips to help minimise the risks associated with using Gen AI:
Gen AI is just another tool to be leveraged, albeit carefully. Investing time in mastering this new skill and learning more about risks and effective use is key. Explore a curated list of online courses, many of which are free, here.