AI Misstep: Melbourne Lawyer's Case Citations Controversy

Published on Thu Oct 10 2024

Melbourne lawyer referred to complaints body after AI generated false case citations

A Melbourne lawyer has been referred to the Victorian Legal Services Board and Commissioner for investigation after using artificial intelligence software that generated false case citations in a family court case, leading to a hearing being adjourned.

Case Background

In a hearing on 19 July 2024, a solicitor, who has not been named, representing a husband in a family law dispute provided the court with a list of prior cases that Justice Amanda Humphreys had requested for an enforcement application. On returning to her chambers, Justice Humphreys and her associates were unable to identify any of the cases on the list. The lawyer later admitted that the list had been prepared using the legal software Leap, which includes an AI component, and that he had not verified the accuracy of the information before submitting it.

Legal Proceedings

In an initial ruling, Justice Humphreys gave the lawyer a month to provide reasons why he should not be referred for investigation. In a subsequent ruling in August, she referred the solicitor for investigation, stating that the AI-generated list had not been reviewed by anyone and that the cases were fictitious.

Although the lawyer apologized and expressed a willingness to learn from the incident, he was still referred for investigation because of the public interest in examining professional conduct issues arising from the increasing use of AI tools in legal practice.

Importance of Verification

The incident highlights the importance of verifying AI-generated work before it is submitted to a court. Although Leap offers a verification process involving human review, the lawyer in this case did not make effective use of that service, and false case citations were submitted as a result.


According to Leap, 66,000 legal professionals globally use its software, which aims to streamline legal processes and improve efficiency. The company emphasized the need for users to understand the workings and limitations of AI tools to ensure accurate and reliable outcomes.

Lessons Learned

This case serves as a reminder of the ethical obligations of legal professionals when using AI tools in legal practice. While AI can be a powerful asset, it must be employed responsibly to avoid misinformation and uphold the integrity of the legal system.


Similar incidents around the world have highlighted the risks of relying solely on AI-generated information in legal proceedings, underscoring the need for oversight and diligence when adopting new technology in the legal field.

For more information on similar cases involving AI tools in the legal industry, you can read the full articles on Crikey and The Guardian.