
Business Law Today

November 2023

Navigating Risk and Rewards: Integrating Generative AI into Modern Legal Practice

Gurinder Sangha

Summary

  • Generative AI is a powerful tool that can be used to improve legal workflows and performance, but it is important to be mindful of the risks.
  • When selecting an AI tool, legal teams should consider the following factors: privacy, use case, and workflow.
  • Customized large language models (LLMs) for legal content are crucial for accuracy and relevance.
  • AI-powered legal tech can enforce best practices, accelerate contract reviews, assess risks, and reduce workloads when implemented effectively.


Nearly everyone is on the generative AI bandwagon, and understandably so. But we are starting to hit some bumps in the road, rattling confidence and dampening enthusiasm. As legal professionals, we shouldn’t be deterred. Instead, we must take a measured and conscientious approach to tool selection and implementation.

AI’s cracks are showing. Research suggests that ChatGPT’s performance may be declining. AI copyright lawsuits abound. And, of course, we’ve all seen the case of the lawyers who filed a brief citing fictitious cases. These events might be disheartening, but all technology takes this journey through the Hype Cycle, Gartner’s framework for how emerging technologies mature. We must stay the course.

The problem is that while we haven’t necessarily uncovered generative AI’s ideal application, AI-powered products are flooding the market. Some are useful; others not so much. Legal professionals must consider several key factors when selecting an AI-powered legal technology product.

How does it use our data?

Legal practice is deeply rooted in confidential and private information. When exploring a legal tech solution, legal teams must ask these essential questions:

  • How will the platform process and use our information?
  • Is the data retained for training?
  • Who has access to the data?

We recently learned that Google inadvertently leaked private Bard conversations into its search results. A leak like this would be catastrophic for lawyers.

Any legal tech solution must be built with privacy in mind from the ground up. Period. Peek behind the curtain to understand the underlying algorithm and its confidentiality implications. Without this insight, you cannot rely on a tool to protect your clients’ information or ensure that your organization is compliant with privacy laws and regulations.

Is it designed for our use case?

The legal field contains countless nuances. You wouldn’t task a person who has no legal background with reviewing contracts. Follow the same principle for AI.

An effective legal tech solution based on a large language model (LLM) requires customization. LLMs trained on general datasets lack the specific knowledge necessary to complete legal tasks effectively and accurately. Without a custom model, the algorithm’s knowledge base is too broad; it can miss critical context, resulting in contracts that do not fit the circumstances, deviate from established best practices, or are unenforceable.

Legal teams need an LLM trained specifically on legal content; many tools are simply an interface for a general LLM. Once again, legal professionals need insight into a platform’s training data and use cases to understand if it aligns with their needs.

Does it help us improve workflows and performance?

Any legal technology tool must be built with human-centered AI. This design leverages the strengths of human critical thinking, creativity, and empathy while incorporating AI capabilities to automate repetitive tasks and free human bandwidth for high-value work.

When evaluating a tool’s fit for your team, consider:

  • What tasks does it accomplish?
  • Do its capabilities address pain points?
  • Does it streamline processes or add extra steps?

Each organization’s needs are different. The right solution will augment your people’s work, not impede it.

Generative AI alone is not enough

Generative AI is a piece of the puzzle, not the solution. The technology works best as part of a robust workflow involving a tech stack of established tools like rule-based AI. This form of AI functions on a set of predetermined rules to make decisions and solve problems, so its results are more predictable than generative AI’s. Examples of everyday use of rule-based AI include email spam filters, which rely on specific keywords and sender email addresses to flag potential spam, and e-commerce recommendations, which are powered by parameters such as products that are often purchased together.
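To make the contrast concrete, a spam filter of this kind can be sketched in a few lines of code. This is a minimal illustration, not any real product's implementation; the keyword list and sender addresses are hypothetical. The point is that every decision traces back to a fixed, auditable rule, so the same input always produces the same output, unlike a generative model.

```python
# Minimal rule-based spam filter: every decision follows a fixed,
# inspectable rule, so results are fully predictable and auditable.
SPAM_KEYWORDS = {"free money", "act now", "winner"}   # hypothetical rule set
BLOCKED_SENDERS = {"promo@spam.example"}              # hypothetical blocklist

def is_spam(sender: str, body: str) -> bool:
    """Flag a message if the sender is blocked or the body matches a keyword."""
    if sender.lower() in BLOCKED_SENDERS:
        return True
    text = body.lower()
    return any(keyword in text for keyword in SPAM_KEYWORDS)

print(is_spam("promo@spam.example", "Hello"))       # True: blocked sender rule fires
print(is_spam("colleague@firm.example", "Lunch?"))  # False: no rule fires
```

Because the rules are explicit, a reviewer can explain exactly why any message was flagged, which is the predictability that makes rule-based components a sturdy complement to generative AI in a legal workflow.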

Ideally, your chosen solution should be accessible from your team’s primary workspace, which for many legal professionals is Microsoft Word. Cumbersome, inconvenient solutions that require application toggling do not foster adoption. And if nobody uses the legal tech, no one benefits.

No matter how powerful AI becomes, technology can’t replace people. Your team possesses originality, experience, situational comprehension, and critical thinking that no machine will ever replicate. Any tool you select should support, not supplant, legal professionals’ work.

The benefits of AI-powered legal tech

AI can deliver many advantages when leveraged correctly. Some of these benefits include:

  • Enforcement of best practices
    AI makes it possible to universally enforce best practices. Algorithms trained on a company’s or client’s legal playbooks can review contracts and replace language deviations with company-specific standardized definitions, preferred negotiation positions, and best practices, eliminating variations between contract writers and ensuring documents meet expectations.
  • Accelerated review
    Legal teams can leverage AI to review lengthy, complex contracts in just minutes. Automated processes reduce time-consuming manual tasks and errors and streamline redlining, resulting in faster negotiations and approvals.
  • Enhanced risk assessment and mitigation
    AI-powered reviews can assess contract risks and flag problematic language, uncovering potentially overlooked issues and allowing lawyers to prioritize high-urgency tasks.
  • Reduced workloads
    Completing repetitive, low-level tasks drains creativity and creates frustration. Lawyers want to make a positive impact, not push papers. AI platforms can bear some of the administrative burden, decreasing human workloads and empowering legal professionals to spend more time on high-value, rewarding activities like building client relationships.

Legal teams should consider embracing generative AI, as long as they are mindful of the risks. While many people may jump off the AI bandwagon as it bumps along the rough road, riding through its inevitable stops and slow progress ensures a faster arrival at the ultimate destination of mature technology.
