More people are using AI in court instead of a lawyer. It could cost you money – and your case
- Written by The Conversation
When you don’t have the money for a lawyer to represent you in a court case, even judges can understand the temptation to get free help from anywhere – including tapping into generative artificial intelligence (AI).
As Judge My Anh Tran in the County Court of Victoria said this year:
Generative AI can be beguiling, particularly when the task of representing yourself seems overwhelming. However, a litigant runs the risk that their case will be damaged, rather than helped, if they choose to use AI without taking the time to understand what it produces, and to confirm that it is both legally and factually accurate.
Our research has so far found 84 reported cases of generative AI use in Australian courts since ChatGPT launched in late 2022. While cases involving lawyers have had the most media attention, we found more than three-quarters of those cases (66 of 84) involved people representing themselves, known as “self-represented litigants”.
Those people – who sometimes have valid legal claims – are increasingly turning to generative AI tools for help with everything from property and will disputes to employment, bankruptcy, defamation and migration cases.
Our ongoing research is part of an upcoming report for the Australian Academy of Law, to be launched later this year. But we’re sharing our findings now because this is a growing real-world problem.
Just this month, Queensland’s courts issued updated guidance for self-represented litigants, warning that using “inaccurate AI-generated information in court” could cause delays, or worse: “a costs order may be made against you”.
As New South Wales Chief Justice Andrew Bell observed in a decision in August this year, the self-represented respondent was “admirably candid with the court in relation to her use of AI”. But while she was “doing her best to defend her interests”, her AI-generated submissions were often “misconceived, unhelpful and irrelevant”.
If you’re considering using AI in your own case, here’s what you need to know.
The temptation to rely on AI
Self-representation in Australian courts is more common than many people realise.
For example, 79% of litigants in migration matters at the Federal Circuit Court were unrepresented in 2023-2024.
The Queensland District Court has said “a significant number of civil proceedings involve self-represented parties”. The County Court of Victoria last year created easy-to-use forms for self-represented litigants.
But as the availability of free or low-cost generative AI tools increases, so does the temptation to use AI, as our recent research paper highlighted.
The risks if AI gets it wrong
Relying on AI tools that produce fake law can result in court documents being rejected and valid claims being lost.
If you’re a self-represented litigant, the court system gives you the right to provide evidence and argument to support your case. But if that evidence or argument is not real, the court must reject it. That means you could lose your day in court.
In those circumstances, the court may make a costs order against a self-represented litigant – meaning you could end up having to pay your opponent’s legal costs.
Lawyers here and overseas have also been caught relying on inaccurate AI-generated law in court.
But a key difference is that if a lawyer uses fake cases that the court rejects, this is likely to amount to negligence. Their client might be able to sue the lawyer.
When someone representing themselves makes the error, they only have themselves to blame.
How can you reduce your risks?
The safest advice is to avoid AI for legal research.
There are many free, publicly available legal research websites for Australian law. The best known is the Australasian Legal Information Institute (AustLII). Another is Jade.
Court libraries and law schools are open to the public and have online resources about how to conduct legal research. Libraries will often have textbooks that set out principles of law.
Australian courts, such as the Supreme Court of Queensland, Supreme Court of NSW and Supreme Court of Victoria, have all issued guidance on when generative AI can and cannot be used.
Check whether the court handling your case has issued such a guide, and follow its advice.
If you still plan to use generative AI, you must check everything against a reliable source. Search for each case you plan to cite, not just to confirm it exists, but also to confirm it says what the AI summary claims it does.
And as Queensland’s guide for self-litigants warns:
Do not enter any private, confidential, suppressed or legally privileged information into a Generative AI chatbot […] Anything you put into a Generative AI chatbot could become publicly known. This could result in you unintentionally breaching suppression orders, or accidentally disclosing your own or someone else’s private or confidential information.
Conducting legal research and producing court documents is not easy. That’s what trained lawyers are for, which is why affordable, accessible legal services are necessary for a fair justice system.
AI is being used to address an access-to-justice problem that it is not well suited to – at least, not yet.
Thanks to Selena Shannon from UNSW’s Centre for the Future of the Legal Profession, who also contributed to this article.