Artificial Intelligence and Condominium Bylaws

Article submitted by Michael Gibson
Miller Thomson Law

In June 2023, two New York lawyers were sanctioned for filing a legal brief that cited fake cases fabricated by an AI program the lawyers had used to help draft their arguments.  In February 2024, a Vancouver lawyer was ordered to pay court costs for citing non-existent cases in a family law matter.

Artificial Intelligence (AI) tools are taking on an increasing role in our daily lives.  In the professional context, reliance on AI can be dangerous, as the cases of the New York and Vancouver lawyers show.  As a tool, AI has many benefits and many drawbacks.  It can sort through and organize vast amounts of data that no human could possibly process in a lifetime.  That said, AI is not capable of understanding or critically questioning the data it compiles and provides, so the human touch is still very much needed.

With the increasing use of AI in both personal and professional settings, unit owners and condominium boards may be tempted to use AI to put together meeting minutes, AGM packages, and even to draft new bylaws.  The dangers and pitfalls of relying on AI tools for professional and legal documents are lessons painfully learned by the lawyers in the above-cited cases.  Condominium boards should take note to ensure that they do not make similar mistakes, to the detriment of their owners.

What can go wrong for boards?  How bad can AI bylaws or minutes be? 

To answer these questions, we should start with the basics of how such AI programs operate.

AI chat programs are generally built on large language models (or LLMs).  These models are online tools that can produce coherent, comprehensible responses to almost any prompt, as if a human had typed the response.  The program achieves this by compiling patterns from enormous amounts of written text gathered across the internet, then generating text that statistically fits whatever prompt was fed into it.  In this way, AI programs mimic an understanding of user prompts in order to produce a sensible-sounding response.

By sifting through vast quantities of web text (a disorganized mass of data to begin with), AI programs take the prompt a person inputs and produce an approximation, drawn from all of that web-based data, that is grammatically coherent and readable by a human.  The problem with such systems is that they are prone to “hallucinations”: confident but nonsensical or invented responses to factual questions.  In short, AI can produce a readable product based on a user input, but the chances of that product being factually or legally accurate in any meaningful way are not good.  Often, it will be a grammatically readable amalgamation of unrelated and potentially contradictory or factually incorrect information that is not usable in any real-world setting.

In this regard, a recent study on AI tools in the legal industry found that AI programs produce these types of hallucinations in up to 88% of results.  For lawyers, these hallucinations have led to citations of fake cases.  For condominium boards, they can lead to factual errors in minutes, erroneous references to non-existent past decisions, and, most damagingly, bylaws that include illegal and unenforceable provisions.

Take, for example, a potential set of condominium bylaws drafted by AI.  Because AI programs work by examining the entirety of the internet for relevant content, and reorganizing that data into a reasonably fitting response to the prompt inputted, there are no practical or territorial limits to the information that an AI program might compile and then incorporate into the draft bylaws.  

In this example, if a condominium board inputted “draft new condo bylaws for an Alberta condo” into an AI program, the program would look worldwide for relevant information about condo bylaws and would produce a set of bylaws.  Even if the program focused on Alberta, it could generally be expected to incorporate some wacky results.  The problem, as we have seen in our office, is that these results are anything but workable, and often include:

  1. Restrictions and prohibitions contrary to Alberta law;
  2. Erroneous provisions that make no sense in our legal landscape; and
  3. Erroneous references to non-existent courts or tribunals.

These types of issues create serious problems for condominium boards.  Obviously, no condominium corporation wants bylaw provisions that are blatantly offside the law, impractical, erroneous, unnecessary, or just plain unworkable.  AI-drafted condominium bylaws are all of these things because they are an amalgam of various template documents and other source material found by the AI program, with provisions clearly taken from a host of irrelevant sources, including other provinces and even other countries.

The fundamental problem

The fundamental problem is that AI programs produce a response based on a non-specific data set covering the entire internet.  In attempting to exclude irrelevant data and reconstruct what is left into a comprehensible response, the program can be expected to fabricate significant portions of that response outright, crafted to fit the context of the task.  For documents like condominium bylaws, the result will be almost entirely unusable.

Bylaws are specific to each condominium corporation in Alberta.  Every condominium is different, and your bylaws need to comply with the applicable laws as well as set out workable rules for your community that are enforceable and practical.  Bylaws that require compliance with foreign rules, reference non-existent amenities, or go on at length about hypothetical issues such as the use of radioactive materials, the operation of brothels, or drug use in condominiums (yes, we have seen all of these in AI bylaws) will create significant administrative and legal challenges for your condominium.

The essential point is that AI does not know your community, nor does it care about your community’s needs.  By obtaining proper, professional service early in the process, you will be able to develop a set of bylaws that are specific to your condominium, compliant with applicable laws, and practically usable.  AI will give you a set of “bylaws,” but they will probably not be the bylaws you want or need.

See:

  1. https://www.reuters.com/legal/new-york-lawyers-sanctioned-using-fake-chatgpt-cases-legal-brief-2023-06-22/
  2. https://www.canlii.org/en/bc/bcsc/doc/2024/2024bcsc285/2024bcsc285.html?autocompleteStr=2024%20BCSC%20285&autocompletePos=1&resultId=becbf075cf1442c4be9164678a0a1832&searchId=2024-03-08T12:54:08:356/275406bf8d25441e852b7e19c7771dd0
  3. Matthew Dahl et al., “Large Legal Fictions: Profiling Legal Hallucinations in Large Language Models” (2024) arXiv:2401.01301