Courting Change: Treatment of AI-Generated Content by Australian Courts

Allie Umoff, Stephanie Lo and Suli Jayasekara

With the rise of artificial intelligence (AI) has come uncertainty about the appropriate use of AI in the legal profession. While AI can undoubtedly be a useful tool for lawyers and litigants, its benefits must be weighed carefully against the risks its use presents.

While Australian courts have thus far provided scant guidance as to the use of AI, the issue has arisen for consideration in the recent decisions of DPP v Khan [2024] ACTSC 19 and Yousef v Eckersley & Anor [2024] QSC 35.  This Insights post analyses these two contrasting decisions and discusses the impact for legal practitioners and parties to court proceedings in Australia.

DPP v Khan [2024] ACTSC 19

In DPP v Khan [2024] ACTSC 19, Mossop J considered a number of character references in the course of sentencing proceedings. The language used in the character reference written by the defendant’s brother prompted his Honour to make enquiries about the use of Large Language Model (LLM) programs such as ChatGPT in the preparation of the character reference. Counsel for the defendant informed the Court that the character references may have been prepared with the assistance of an automated computer translation program but did not involve the use of an LLM.

However, Mossop J found that the content of the character reference was consistent with the use of AI. First, Mossop J found the defendant’s brother’s description of having known the defendant “personally and professionally for an extended period” inconsistent with how one would usually describe their relationship with a sibling (at [40]). Second, his Honour found the descriptions of the defendant’s “proactive attitude towards cleanliness” and “strong aversion to disorder” suggestive of the use of an LLM in preparing the character reference.

Mossop J voiced his concern that the use of LLMs in preparing character references to be used in sentencing proceedings is inappropriate as it “becomes difficult for the court to work out what, if any, weight can be placed upon the facts and opinions set out in them” (at [43]). His Honour similarly noted that the use of computer-based translations, which often disregard the subtleties of the use of language, is also undesirable in preparing character references.

Importantly, his Honour also recognised a positive duty on counsel appearing at sentencing proceedings to make appropriate inquiries and to inform the Court whether any character references tendered have been prepared with the use of LLMs or any automated computer translation programs (at [43]).

On this basis, his Honour placed little weight on the character reference submitted by the defendant’s brother in comparison to other character references which did not appear to have been prepared with the assistance of an LLM or automated computer translation program (at [44]).

Mossop J’s reasons in Khan suggest that courts may place less weight on AI-generated content than content generated by traditional means and that courts expect candour from legal practitioners as to the use of AI in the preparation of material presented to a court.

Yousef v Eckersley & Anor [2024] QSC 35

A contrasting approach to the use of AI in court proceedings is seen in the recent decision of Wilson J in the Supreme Court of Queensland in Yousef v Eckersley & Anor [2024] QSC 35.

In Yousef, the plaintiff suffered physical injuries in a motor vehicle accident, and the quantum of damages was in dispute. The plaintiff, who was self-represented, used AI in the preparation of his submissions. In the course of his reasons, Wilson J summarised the trial process and acknowledged that “the plaintiff’s submissions have been prepared with the assistance of the artificial intelligence platform of Chat GPT. The plaintiff vouched for the accuracy of his submissions, however, stated that this platform assisted in their organisational structure and added a flourish to his submissions” (at [17]).

In this instance, the Court took no issue with the plaintiff’s use of ChatGPT to prepare his submissions, given that the plaintiff was candid about his use of the platform and his reasons for doing so, and was able to vouch for the accuracy of the final product. Yousef demonstrates the importance of candour and of taking responsibility for the use of AI in court proceedings, in contrast to the approach taken in Khan. The different treatment of the AI-generated content in the two cases also demonstrates the importance of context in determining whether it is appropriate or useful to use AI tools in preparing material for use in court.




LK Law Pty Ltd
Level 23, 25 Grenfell Street
Adelaide SA 5000
Telephone: +61 8 8239 4600


33 Black Friars Lane
London EC4V 6EP
United Kingdom
Telephone: +44 20 7400 2180