
Balancing Innovation and Integrity: NSW Chief Justice Restricts AI in Legal Practice

As artificial intelligence becomes more embedded in professional life, the legal sector in New South Wales is taking a deliberately cautious path. Chief Justice Andrew Bell has introduced wide-ranging restrictions on the use of generative AI in court proceedings, aiming to preserve the integrity of the judicial process while acknowledging the technology's utility in limited contexts.

Under the newly revised Practice Note SC Gen 23, which came into effect on 3 February 2025, AI tools such as ChatGPT and legal-specific platforms like Lexis Advance AI may be used for administrative tasks or legal research. However, they are expressly banned from generating or altering affidavits, witness statements, character references, and expert reports without prior leave of the court. These documents must reflect a person's own knowledge or opinion, not that of a machine.

The concern, according to Chief Justice Bell, is twofold: first, that AI's ability to produce convincing but fictitious content could undermine evidentiary reliability; and second, that overreliance on AI could erode the human judgment essential to democratic justice. Barristers have also warned that AI "hallucinations", fabricated case citations or arguments, pose a real risk to legal accuracy and accountability.

The new guidelines go further by regulating the use of AI for sensitive or confidential material, prohibiting its submission into public large language models unless strict privacy controls are in place. Lawyers must disclose when AI has been used and verify all references manually, reinforcing the profession's ethical obligations.

In short, NSW's approach is not anti-AI, but firmly pro-accountability. While AI may streamline aspects of legal practice, the courts are clear: the core work of truth-seeking, reasoning, and judgment remains a distinctly human task.
