February 27, 2026
By Evan A. Brown
Today more than ever before, artificial intelligence platforms are changing the way we work. This is especially true for work that generates content and documents. Generative AI tools, spearheaded by overnight household names like ChatGPT, Copilot, Gemini and Claude, have begun to rapidly alter workflows in these sorts of jobs. Although people often think of construction as generative in a different sense, construction projects generate a huge amount of content and documentation. For this reason, the construction industry — and the lawyers who serve it — are beginning to incorporate AI. It behooves us to understand what this means and what we need to watch for.
AI tools are being marketed to those working at every stage of construction projects. Lawyers negotiating and drafting contracts are tantalized by AI offerings from Westlaw, Lexis, Harvey and others, which can often be trained on their law firm's particular clauses, templates and prior work. While this can make drafting more efficient, it threatens to minimize or eliminate the creative aspects of the process that come from wrangling with words and ideas in the context of a particular negotiation. It also introduces new potential for errors. We construction lawyers are grappling with these issues now and slowly beginning to incorporate AI.
AI tools are also being marketed to contractors, design professionals and other construction project participants as a way to manage and interact with drawings, specifications and other contract documents, as well as submittals and communications, during a complex project. One of the most difficult parts of large or complicated construction projects is ensuring efficient and effective communication of plan and project information and contractual requirements, and there are now numerous AI platforms geared toward processing project files and providing interactive answers to questions about them. In some ways, it's the logical extension of the building information modeling (BIM) push of the last two decades. If done correctly, this can help participants understand project requirements without resort to cumbersome submittal processes or scheduled meetings. However, this efficiency can come at the cost of bypassing processes that have become integral to contemporary construction.
Allocations of legal risk are woven into the fabric of construction projects. Contractors typically present their questions regarding plans and designs to design professionals (who usually contract with the project owner), who in turn provide responses and, sometimes, alterations or clarifications. The “request for information” (RFI) process allows the designers to address issues with the plans, the owner to be kept apprised, and the contractor to receive the answers it needs to move forward. If the contractor instead simply queries an AI tool to get answers, the designers cannot weigh in, the owner may have no idea there is any confusion, and the contractor moves forward at its own risk. Replacing the traditional process with an AI-centered process would require significant reworking of the risk allocation via contract.
AI adoption can also increase the risk of error since generative AI tools, at least at this stage of the technology, are known to “hallucinate” — to provide plausible-sounding answers that are actually incorrect. This doesn't always involve wholesale fabrication and can be difficult to detect. As an experiment, I asked a generative AI tool to simply restate a sentence 10 times without alteration. Each “restatement” was very slightly different, with transposed words or altered punctuation. At a quick glance, the sentences looked the same, until I read them closely. In a recent case with more serious consequences, a law firm was sanctioned after it asked a generative AI tool to simply “reformat” citations. Rather than moving the pieces of the citations around the way a human editor would, the AI generated new, correctly formatted citations but made certain components up entirely. Like the “restatements” in my test, the results were perhaps convincing at a glance but in fact included hallucinations. In both cases, the AI was not just reusing what was given to it, as a human would, but generating entirely new output. This creates opportunities for hallucination.
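The "restatement" experiment above illustrates why subtle hallucinations are hard to catch by eye. As a sketch of how such near-identical outputs can be checked mechanically, the snippet below uses Python's standard difflib module to surface word-level differences between an original sentence and a restatement. The sentences are hypothetical examples, not from the experiment described in the article.

```python
# Illustrative sketch: detecting small, easy-to-miss changes between an
# original sentence and an AI "restatement" using the standard library.
import difflib

# Hypothetical example sentences (not from the article's experiment).
original = "The contractor shall submit all RFIs to the architect in writing."
# A "restatement" with transposed phrases and altered punctuation.
restated = "The contractor shall submit all RFIs in writing to the architect"

def word_level_diff(a: str, b: str) -> list[str]:
    """Return difflib's word-by-word diff lines between two sentences."""
    return list(difflib.ndiff(a.split(), b.split()))

# Keep only the lines marking additions ("+") or deletions ("-").
changes = [line for line in word_level_diff(original, restated)
           if line.startswith(("+", "-"))]
print(changes)  # non-empty: the sentences differ despite looking alike
```

A comparison like this catches transpositions and punctuation changes that a quick visual scan misses; an empty result confirms the restatement is verbatim.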
This can be a problem when asking a generative AI tool to evaluate contract documents or plans and determine what they require. When we read contracts, we consider what the contract says and means. When an AI reads a contract, it considers what the contract says and how it should generate new information in response to the user's prompt. Many AI tools are even tuned to "please" the user with their responses. This creates the potential for significant risk when, for example, such tools are used to answer project questions outside of a formal RFI process.
Perhaps due to this risk, AI is being adopted in construction more slowly than in some fields. For now, nearly all construction contracts continue to assume that the typical processes will continue. But over time, as AI technology improves and the volume of data generated on construction projects grows further, AI tools are likely to play a more significant role on projects. "Prompt engineering" — the art and science of crafting directions to AI tools to elicit better answers — will likely become an important skill. Incorporating AI tools in a meaningful way will require new contract terms and a reallocation of risks. Construction industry players would do well to get ahead of the trend and begin exploring the benefits and risks of AI technology, and its applications and limitations, now.
Evan A. Brown is an attorney at Stoel Rives who represents clients in the construction industry in a variety of litigation, transactional, and alternative dispute resolution matters. He can be contacted at evan.brown@stoel.com.