In-house lawyers beware: barrister rebuked for relying on hallucinating AI

3 June 2025

[Image: a 3D image representing artificial intelligence]

Generative AI is being hyped as the saviour of everything from the NHS to education to legal services. So lawyers using artificial intelligence should come as no surprise.

We previously reported that a large majority of general counsel and senior in-house lawyers are now using some form of AI. The uses are wide and varied, from reviewing documents to researching legal points for court actions.

We also reported on the risks AI poses for in-house lawyers. Just recently, a barrister failed to spot that sources they had researched and relied upon in their legal argument were fake. This led to a withering rebuke from the judge.

Mr Ayinde challenged the London Borough of Haringey’s handling of his housing application. His barrister referred to five cases which did not exist. The barrister was not cross-examined, so it was not clear how these fake cases had been found, but the judge speculated they had been hallucinated by generative AI. When the council pointed out that the cases were non-existent, Mr Ayinde’s barrister dismissed the challenge.

The judge took a different view and was scathing in his criticism. He said it had been improper of the barrister to put fake cases into a pleading. It was unreasonable for her to claim these fake cases were “minor citation errors” when challenged on their existence. Furthermore, providing a fake description of five fake cases amounted to professional misconduct. The judge said she should have reported herself to the Bar Council. He also said Mr Ayinde’s solicitors, who had instructed the barrister, should have reported themselves to the Solicitors Regulation Authority for describing the barrister’s blunders as “cosmetic errors”. While the judge could not be sure the barrister had relied upon AI which had hallucinated these fake sources, he said it would have been negligent of her to use it without checking the output first.

This is not the first time this has happened. In 2023, a man brought a claim against an airline for personal injury. His legal team submitted a brief that cited several cases which proved to be non-existent. Last year, an attorney in Texas was fined for submitting a court filing with non-existent cases and quotations generated by AI. Earlier this year, a federal judge in Wyoming threatened to sanction lawyers who had used non-existent cases in a claim against Walmart.

While use of AI by in-house lawyers is surging, they should remember that they are not immune to hallucinating AI. As these examples show, a failure to use AI properly could lead to a rebuke by a judge, disciplinary action by their employer or even a report to their regulator for improper, unreasonable or negligent conduct.

None of those outcomes is favourable, and there are some simple steps in-house lawyers can take to avoid them:

  1. Verify: Always check AI-generated citations against reliable legal databases and sources. Remember, you should check AI-generated outputs just like you would check work completed by your paralegal.
  2. Policy: Implement appropriate policies regarding use of AI. Be careful which AI tool you use, how you use it and the guardrails you put in place. Consider how to increase accuracy and reduce hallucinations.
  3. Ethics: Ensure you and your legal team are aware of the professional ethics and disciplinary risks if you do not check AI output.
  4. Training: Stay informed about the capabilities and limitations of AI tools and train your team to ensure they use AI effectively. In particular, train your team on what prompts they should enter into your AI tool and train them to check the results.
