People are now using artificial intelligence (AI) for everyday tasks such as creating videos, writing content, and doing research.
While AI can be helpful, there are growing concerns about accuracy, reliability, and unintended errors — particularly when AI is used for legal or legislative matters.
The BCCM Office has recently seen:
- applications containing incorrect or irrelevant legal references
- citations to legislation or decisions that do not exist
- written enquiries quoting provisions that are not applicable.
This has raised concerns about people relying on AI without checking the information.
Relying on AI alone when preparing a dispute application can have serious consequences. If an adjudicator decides there is no legal basis for the application and considers it frivolous or vexatious, the applicant may be ordered to pay costs.
How AI works
Today, AI can learn tasks, make decisions, and generate content based on instructions from users. It is widely used to automate work, create written material, and analyse information.
Artificial content generators:
- produce written responses based on user instructions
- predict likely words by analysing large amounts of online data
- draw from sources such as websites, news articles, or past decisions.
However, AI does not understand meaning and cannot tell what is true or false.
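The "predict likely words" behaviour described above can be sketched with a deliberately tiny toy model. This is a hypothetical illustration only, not how any real AI product is built: real systems are vastly larger, but the underlying idea of predicting the next word from patterns in past text, with no notion of truth, is the same.

```python
from collections import Counter, defaultdict

# Toy "bigram" model: predicts the next word purely from how often
# word pairs appeared in its (made-up) training text.
training_text = (
    "the body corporate maintains common property "
    "the owner maintains the lot "
    "the body corporate maintains the roof"
)

# Count how often each word follows each other word.
following = defaultdict(Counter)
words = training_text.split()
for current, nxt in zip(words, words[1:]):
    following[current][nxt] += 1

def predict_next(word):
    """Return the most frequent follower of `word`, or None if unseen."""
    if word not in following:
        return None  # missing data: the model simply cannot answer
    return following[word].most_common(1)[0][0]

print(predict_next("body"))         # "corporate": seen often in training
print(predict_next("maintains"))    # whatever word was statistically most common
print(predict_next("adjudicator"))  # None: never seen, no understanding
```

Note that the model never checks whether its output is correct; it only reflects frequencies in whatever data it was given, which is why missing or biased data produces confident-looking mistakes.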
Limitations of AI
Even though AI has access to a lot of information, it has limits. It can struggle with fine detail or unusual situations, and it often repeats existing information rather than creating something genuinely new.
Another important limitation is that AI can produce incorrect or biased content. When AI invents information, this is known as 'hallucinating'. AI can only use the data it has learned from and has access to, so missing or biased data can lead to mistakes, especially when AI invents material to try to match the instructions it is given.
Some AI programs provide citations for the websites or documents they draw information from. Checking the quality of these sources helps users verify whether the information is accurate.
Risks of relying on AI for legal information
A recent adjudicator’s order has highlighted potential consequences when incorrect AI-generated information is used in an application.
In Sky Gardens [2025] QBCCMCmr 373 (29 October 2025), the applicant submitted that they suffered damage to their lot due to a blocked or damaged drainage pipe from the air conditioning unit within a wall. The applicant acknowledged that the pipe only serviced their lot but asserted that due to its location within a wall cavity, it was not their responsibility to maintain. The body corporate disagreed and said the pipe was the owner’s responsibility.
When reviewing the application, the adjudicator found serious problems with the information provided. This included:
- legal provisions that were incorrect or not relevant
- case decisions that did not exist or had nothing to do with the dispute
- claims and reports that were not supported by evidence.
The adjudicator noted that some of the incorrect references suggested artificial intelligence may have been used to prepare the application, as AI tools are known to sometimes generate false case citations.
The adjudicator clarified the rules surrounding maintenance of utility infrastructure under both section 20 of the Body Corporate and Community Management Act 1997 (the Act), and section 170 of the Body Corporate and Community Management (Accommodation Module) Regulation 2020.
The owner argued that the leak originated from a structural or installation defect rather than owner misuse or poor maintenance. However, they did not explain how this would alter the maintenance obligations set out in the legislation, nor did they demonstrate that the pipe was damaged by the body corporate or by something it was responsible for.
The adjudicator described the application as confused, repetitive, and unsupported by evidence. Even after being given the chance to provide correct case references, the applicant could not show how they were relevant.
The adjudicator also stated that if AI or other sources were used, the applicant had not properly checked the accuracy of the information.
In the end, the adjudicator found the application was misconceived and without substance. The applicant was ordered to pay the body corporate $2,000 in costs.
Financial implications
When seeking a legally binding order, it is important that all information is correct. As the previous example shows, incorrect information can have financial consequences.
Under section 270 of the Act, if an adjudicator dismisses an application for being frivolous, vexatious, misconceived or without substance, they can also order the applicant to pay costs of up to $2,000 to another party or parties to the application.
The onus is on the applicant to ensure they do not provide false or misleading information or documents during the dispute resolution application process. If they do, under sections 297 and 298 of the Act the Magistrates Court can impose a fine of up to $10,014.
Your responsibility
While this is the first incident relating to AI use in our office, AI-related issues are commonplace in other jurisdictions. They are so common that lawyers have lost the right to practise after relying on false citations, and tribunals and courts in Queensland now have practice directions requiring that the responsible solicitor or barrister be named in submissions to ensure the accuracy of the information cited.
Due to the increase in the use of content-generating AI, Queensland Courts have also published a guide for non-lawyers: The Use of Generative Artificial Intelligence (AI) Guidelines for Responsible Use by Non-Lawyers.
While AI can respond to prompts with what appears to be coherent legal advice, there is no safeguard against it making up material. Anyone researching information or writing an application must verify the content themselves; careful prompting alone cannot guarantee the content generator will not hallucinate or add irrelevant information.
You can easily verify body corporate information and sections of the legislation with our information and community education unit. Anyone who needs general information can ask questions about the legislation in our free call back service on 1800 060 119 or write to us by making an online enquiry.
This post appears in Strata News #789.
Commissioner for Body Corporate and Community Management. Information Service Freecall: 1800 060 119
