The legal industry isn’t immune to the magnetic pull of AI.
When a Sydney lawyer submitted a court document generated by ChatGPT containing entirely fictitious citations, the response was immediate. The Supreme Court of New South Wales swiftly introduced a formal Practice Note on the use of AI tools in court and court-related legal documents.
This blog covers:
- the court’s response to AI in the courtroom;
- how Lawpoint uses artificial intelligence;
- what every business can learn from the legal profession’s approach.
AI in the courtroom
In February 2025, the Supreme Court of New South Wales issued a Practice Note—that is, a formal document that outlines how lawyers must conduct themselves in certain areas of legal practice. In this case, it set out strict rules on how and when generative AI systems can be used in court proceedings.
Generative AI can’t be used to prepare documents that are intended to reflect a person’s evidence and/or opinion, or other material tendered in evidence or used in cross-examination.
This includes:
- Affidavits – for example, a sworn statement detailing a version of events;
- Witness statements – outlining what a person saw, heard, or experienced;
- Expert reports – such as those written by a psychologist;
- Character references;
- Any material tendered in evidence or used in cross-examination.
These uses are prohibited because AI cannot substitute for a person’s evidence or opinion, which must be based on that person’s actual knowledge or views.
AI models can be used for administrative tasks. These include:
- Searching documents – for example, scanning a large file to find a date of birth or case number;
- Creating chronologies – building a timeline of key events (e.g. marriage, separation, court dates);
- Indexing material – organising large volumes of documents into a structured format;
- Generating witness lists;
- Generating precedents that can be used in transactional legal tasks.
Even when AI is used in an administrative capacity, lawyers must still:
- Verify the accuracy of any AI-generated content;
- Disclose that AI was used and in what capacity; and
- Take full responsibility for the information submitted.
Failure to do so can have very serious consequences, including costs orders or findings of professional misconduct.
Most law firms offer transactional legal services that do not involve courtrooms. This is where AI can offer real benefits, including cost savings for clients.

AI in daily legal practice at Lawpoint
Lawpoint welcomes the use of artificial intelligence tools that:
- comply with the Law Society of NSW’s guidelines and the Practice Notes issued by various courts;
- improve efficiency; and
- meet the highest ethical standards – particularly around privacy and confidentiality.
However, we do not use AI to replace the specialised skill of our lawyers.
Like many law firms, we use cloud-based practice and document management software to streamline our practice. Like other cloud-based products, ours regularly offers new AI-driven features. For example, it has an AI-powered search function that helps us quickly locate key details in large files—for instance, a critical reference buried in a 900-page document.
This helps us work faster, reduce repetitive manual tasks, and keep client costs down.
However, what’s most important about our software is that it does not learn from, train its systems using, or share client data outside our firm. Client confidentiality is maintained at all times.
We, of course, also use the AI features introduced to software like Office 365. But that is where our AI use currently ends.
The use of AI programs carries inherent risks, and these risks are often unknown to users, with important details buried in fine print that is rarely read.
Take Grammarly – one of the most popular tools for writing. Grammarly is an AI program that integrates with and gains access to other programs, such as Outlook and Word.
Our Principal, Romeo El Daghl, reviewed Grammarly’s website to see what information it captures. Examples of the information Grammarly captures and stores include:
- the content of what is written in an email;
- the subject line in an email; and
- the names and email addresses of the email recipients.
We doubt that most users of Grammarly are aware of the extent of the information being captured. We take our clients’ confidential information very seriously, and as a result Grammarly is not permitted in our firm.
While we see the potential for AI in the legal profession, AI comes with limitations:
- it isn’t a substitute for critical thinking or professional judgment;
- it can’t capture the subtle nuances of a client’s case;
- it can’t consider the human impact of a legal decision; and
- it can’t weigh up the ethical considerations that are fundamental to our work.
We are open to using AI systems that boost our efficiency and allow us to spend more time where it will make the biggest difference. We continue to monitor the market for new tools, such as the new AI legal research tool Harvey AI.

What other businesses can learn from this
The lessons from the legal profession are applicable to any business exploring the use of AI. While the potential benefits of AI are clear, the risks of over-reliance or misuse are equally significant.
A recent KPMG study found that only 30% of Australians believe the benefits of AI outweigh its risks. This is the lowest confidence level globally, and it reflects a broader hesitation around data security, accuracy, and accountability.
The Australian Government has proposed mandatory AI “guardrails” for high-risk applications of AI, including a requirement to disclose the use of AI to customers. However, until formal regulation is introduced, businesses remain responsible for their own governance and risk management practices.
We recommend any organisation using AI ask the following questions:
- Is AI being used to replace professional judgment or support it?
- Is sensitive data being shared with external systems, and if so, how is that data being used and stored?
- Are outputs being verified by a qualified individual?
- Could a mistake damage your reputation or expose you to liability?
- Is your use of AI compliant with any standards or rules applicable to your industry or profession?
We can help
If you need help developing AI policies or risk assessments, or simply need strategic legal advice, our commercial lawyers can help. Contact us today.