Litigation News | 2025

Lawyer Sanctioned for Failure to Catch AI “Hallucination”

Rafael P. McLaughlin

Summary

  • A federal court found that the attorney violated Rule 11’s requirement to verify legal contentions.
  • The submitted legal brief cited and quoted cases “hallucinated” (fabricated) by artificial intelligence.
  • A lawyer’s use of AI must comply with the fundamental principles of the legal profession, including ethics, accuracy, and client service.

Yet another attorney has been sanctioned for including in a legal brief case citations and quotations “hallucinated” (fabricated) by artificial intelligence (AI). Gauthier v. Goodyear Tire & Rubber Co. represents another “teaching moment” regarding the ethical pitfalls and professional implications of generative AI in legal practice. This is especially true considering the proliferation of AI within the legal industry. ABA Litigation Section leaders contend that the case is a reminder that a lawyer’s use of AI must comply with the fundamental principles of the legal profession, including ethics, accuracy, and client service.

AI Hallucinations Create an Ethical Duty

The sanctioned lawyer represented the plaintiff in a wrongful termination action against Goodyear in the U.S. District Court for the Eastern District of Texas. Goodyear filed for summary judgment, and the plaintiff’s counsel filed a response in opposition. Goodyear filed a reply in which it observed that the plaintiff’s response cited two cases that do not exist and included multiple quotations that Goodyear could not locate in the cited authority. But the plaintiff’s counsel filed a sur-reply that ignored Goodyear’s concerns about the citations and quotes.

The district court ordered the plaintiff’s counsel to show cause why sanctions should not be imposed for his failure to comply with Federal Rule of Civil Procedure 11(b)(2), which requires an attorney filing a litigation document to certify that “the claims, defenses, and other legal contentions are warranted by existing law.” The court also ordered the plaintiff’s counsel to address the local rule governing the use of AI and a lawyer’s responsibility to “review and verify any computer-generated content to ensure that it complies” with the lawyer’s duty of candor, diligence, and respect to the court. 

At the show cause hearing, a contrite plaintiff’s counsel admitted that he filed the response without reading the cited cases or even confirming their existence. Rather, he used one AI tool to produce the response and relied upon another AI product that reportedly failed to flag the hallucinated citations and quotes. The plaintiff’s counsel also acknowledged that after receiving Goodyear’s reply, he did not initially investigate the claimed nonexistence of the legal authority, but waited until after receiving the court’s order to show cause to personally confirm that the cited authority did not exist. Finding that the plaintiff’s counsel had violated Rule 11 and the applicable local rule, the court ordered him to pay a $2,000 penalty and attend a one-hour CLE on AI. 

AI Does Not Replace a Lawyer’s Independent Legal Judgment

“I think the monetary sanction and the CLE were appropriate considering the opprobrium generated from the case,” says Michael D. Steger, New York, NY, Co-Chair of the Litigation Section’s Solo & Small Firm Committee.

However, Deborah Winokur, Philadelphia, PA, Vice-Chair of the Section’s Ethics & Professionalism Committee, believes that ordering one hour of CLE “really missed the mark” given the time it takes to “really learn AI.” 

Winokur adds that for all the focus that cases like Gauthier place on the technological limitations of AI, they highlight “a basic lapse” in “professional competence.” Although hallucinated legal citations are “classified as AI mistakes, they really [involve whether] the lawyer [did] what they were supposed to do in terms of verifying the cases, using reliable sources to check the citations.”

Gauthier’s “takeaway is pretty simple—you need to check the sources you’re citing,” Steger offers. AI can be “very helpful for initial research” because it allows lawyers to “digest a large number of cases” and focus on “what should be appropriate authority,” he says. But it remains “incumbent upon the lawyer to drill down into that authority [to] make sure it actually exists, and it actually supports the proposition for which it is actually being cited,” notes Steger.

AI Can Provide “Real Benefits”

Notwithstanding the pitfalls of AI that cases like Gauthier have recently magnified, Section leaders say that lawyers should not be scared off from this technology and the merits of incorporating it into their practice. Indeed, Model Rule of Professional Conduct 1.1 requires lawyers to “keep abreast of changes in the law and its practice, including the benefits and risks associated with relevant technology.” The ABA has also issued guidelines on the ethical use of AI, highlighting the need for competency, informed consent, confidentiality, and reasonable fees.

Lawyers “can’t just dunk [their] heads in the sand and ignore” AI as too risky “because there could be real benefits to clients through different applications,” Winokur says. Winokur adds that “lawyers should treat AI as a tool” that can support their briefing. But lawyers should “lean into their practice experience” and “the arguments [they have] learned and perfected over the years” and not let AI replace their judgment and expertise, she notes.
