AI-hallucinated cases end up in more court filings, and Butler Snow ...
Butler Snow is asking a federal judge to exempt its government client from sanctions for the law firm's use of incorrect citations generated by artificial intelligence in two documents filed with the court.
"There are no excuses for counsel’s behavior, only explanations," the firm said in a May 19 response to an order to show cause issued by U.S. District Judge Anna M. Manasco of the Northern District of Alabama.
AI Generated Citations
A Butler Snow partner used ChatGPT to find cases supporting what he thought was a well-established legal position, the response said. He inserted the citations into a draft brief without verifying their accuracy. "These citations were 'hallucinated' by ChatGPT in that they either do not exist and/or do not stand for the proposition for which they are cited," the response said.
The case has drawn coverage from the Alabama Reflector, Reuters, AL.com and the Guardian.
Other AI Hallucination Cases
Butler Snow is one of several larger firms dealing with AI hallucinations, Reuters reports. In one instance, Latham & Watkins admitted including a footnote in an expert report that cited and linked to a real article but with an AI-generated fake title and incorrect authors.
In another case, K&L Gates and a second firm were sanctioned $31,100 for submitting a brief with "bogus AI research." A database compiled by an academic in Paris lists 106 cases worldwide in which AI hallucinations ended up in court documents, according to the Guardian.
Response and Apology
The Butler Snow partner who took responsibility is identified as Matthew Reeves. He did not immediately reply to an ABA Journal request for comment made by email and voicemail. Reeves and other Butler Snow lawyers are defending the commissioner of the Alabama Department of Corrections in a lawsuit by an inmate who alleges that he was not protected from repeated stabbings.
Butler Snow issued an apology, stating, "Butler Snow is embarrassed by what happened here, which was against good judgment and firm policy. There is no excuse for using ChatGPT to obtain legal authority and failing to verify the sources it provided, even if to support well-founded principles of law."
The firm’s chair, Christopher Maddux, did not immediately respond to a Journal email seeking comment.