The Fair Work Commission has released draft rules governing how workers, employers and lawyers use generative AI in cases, including mandatory disclosure, accuracy checks and special safeguards for witness statements, and is now calling for public feedback.
The Fair Work Commission has moved to formally regulate the use of generative AI in cases before it, releasing an exposure draft Guidance Note that would require parties to disclose when they have used tools such as ChatGPT, Claude, Copilot or Gemini to prepare documents.
In a statement issued on 24 March 2026, Commission President Justice Hatcher said the proposed Guidance Note: Use of Generative Artificial Intelligence in Commission cases is part of a package of reforms aimed at managing an “unprecedented GenAI‑driven increase” in the tribunal’s workload.
By the end of the 2025/26 financial year, the Commission expects its total workload to have grown by more than 70% in just three years. Over the same period, it has observed widespread use of AI‑generated language in applications, particularly unfair dismissal and general protections matters, and a breakdown in the historical link between dismissal‑related lodgments and labour‑market conditions.
Justice Hatcher said it can “reasonably be inferred” that much of this growth is being driven by the increasing use of generative AI tools by potential litigants.
Three new requirements when GenAI is used
The exposure draft Guidance Note sets out three core obligations that would apply whenever generative AI is used to prepare an application or any other document to be lodged in a Commission case.
Requirement 1: Disclose AI use
Parties who use GenAI in writing, creating, modifying or otherwise preparing any document for lodgment would be required to explicitly state in that document that GenAI was used.
The Commission stresses that disclosure, in itself, will not affect the merits of a party’s case, provided they comply with the other requirements. However, failing to disclose AI use could result in a document being given reduced weight, disregarded altogether, costs orders, or even dismissal of the case.
Commission forms and templates will be updated to include a dedicated “Use of GenAI” section, allowing users to meet this disclosure obligation simply by completing the relevant part of the form. An example of the proposed section has been published alongside the draft Guidance Note.
Requirement 2: Verify every detail
Anyone using GenAI to prepare a document must carefully check and, where necessary, correct the content before lodging it, then state in the document that this checking has been done.
In particular, parties must ensure that:
- All references to facts or evidence are correct and the facts or evidence actually exist
- All cases, legislation, textbooks and articles referred to exist and support the legal propositions attributed to them
- All extracts and quotations are accurate and properly attributed to the correct source
The draft Guidance Note makes clear that GenAI itself cannot be used to verify the accuracy of content. Instead, parties must rely on authoritative sources, such as Commission Benchbooks and case law, the Commission’s decisions database, AustLII for court judgments, and the Federal Register of Legislation for statutes and regulations.
The Guidance Note also reminds users that knowingly providing false or misleading information to the Commission is a criminal offence carrying a potential penalty of up to 12 months’ imprisonment under section 137.1 of the Criminal Code.
Requirement 3: Extra safeguards for witness statements
Where GenAI is used in connection with a witness statement or declaration, additional obligations would apply.
The Commission “recommends” that generative AI not be used to create the substantive content of witness statements or declarations at all, stressing that these documents must reflect a witness’s own knowledge rather than AI‑generated material.
If GenAI is used to edit, modify or otherwise prepare a statement or declaration, the witness or declarant must:
- Check the document and make any necessary changes so that it is based on their own knowledge and is true to the best of their knowledge
- Explicitly declare in the document that it is based on their own knowledge and is true to the best of their knowledge
False or misleading sworn or affirmed evidence before the Commission is also an offence, punishable by up to 12 months’ imprisonment under section 678 of the Fair Work Act 2009.
Concerns about “hallucinations”, privacy and bias
The draft Guidance Note sets out in detail why the Commission is moving to formalise expectations around AI use.
While acknowledging that GenAI can improve efficiency and enhance access to justice for self‑represented parties by helping them to draft clearer submissions, the Commission highlights a range of risks:
- AI‑generated material may be inaccurate, incomplete, out of date or entirely fabricated
- Tools may invent (“hallucinate”) case law, legislation, citations and quotations, or misstate the effect of authorities
- Outputs may apply laws from the wrong jurisdiction, rely on superseded law, or fail to recognise that decisions have been overturned
- Models cannot “understand” the unique factual, cultural, emotional or broader social and legal context of a particular case and may reproduce biased or unreliable source material
The Commission also flags serious privacy and confidentiality concerns. Many public GenAI tools collect and retain user prompts, documents and uploads, which may then be incorporated into training data.
The draft Guidance Note warns that:
- Personal, confidential or legally privileged information provided to public GenAI tools could become publicly known
- Supplying information that is subject to a non‑publication order to GenAI could breach that order and constitute an offence under the Fair Work Act
- GenAI outputs may contain plagiarised content or material infringing others’ intellectual property
To manage these risks, the Commission recommends that parties:
- Do not provide personal information about any person involved in a case, or confidential information about the matter, to public GenAI tools or to any AI system that may not keep the information secure
- Avoid giving GenAI names or other details that could be used to identify individuals
- Understand the privacy policies of any GenAI tool they use, as well as their own legal obligations regarding confidentiality
Higher bar for lawyers and paid agents
The Commission expects legal practitioners to meet a “more onerous” standard in their use of GenAI.
In addition to complying with the general requirements, lawyers must:
- Maintain high ethical standards and comply with their existing professional obligations in all AI use before the Commission
- Include hyperlinks to all case law cited in any document prepared with the assistance of GenAI
Paid agents who charge fees for representing parties will also be required to insert hyperlinks to any case law referred to in AI‑assisted documents.
Justice Hatcher’s statement notes that this hyperlink requirement is currently confined to practitioners and paid agents because of concerns that self‑represented individuals may find it difficult to include links in their documents. However, the Commission is actively seeking feedback on whether the requirement should instead apply to everyone who uses GenAI in preparing documents for lodgment.
Forms to carry new “Use of GenAI” section
The Commission has released an example “Use of GenAI” section that it proposes to add to all relevant forms, including application forms and templates such as outlines of submissions and witness statement formats.
Under the draft design, applicants would:
- Indicate whether they used GenAI in preparing the form
- If they did, tick a series of declarations confirming they have checked all references to facts and evidence, all cited authorities, and all extracts and quotations, and that all remaining details are correct and relevant
The example section warns that failure to meet these obligations may result in dismissal of an application or costs orders.
Opportunity to comment
The Commission is inviting submissions on the exposure draft Guidance Note, including on whether hyperlinking obligations should extend beyond legal practitioners and paid agents to all users of GenAI in Commission proceedings.
Comments must be lodged by email to [email protected] by 4:00 pm (AEST) on Friday 10 April 2026.
Justice Hatcher emphasised that the GenAI Guidance Note is intended to be a living document and will be updated over time as technology and practice evolve, but that immediate procedural reforms are necessary to address the impact of AI‑driven growth in the Commission’s caseload.