
Second Circuit Court of Appeals Refers Attorney Who Cited Nonexistent Case Provided by ChatGPT for Discipline Investigation

Alan S. Wernick


In its January 30, 2024, decision in Park v. Kim, the U.S. Court of Appeals for the Second Circuit addressed a reply brief in which the plaintiff’s attorney cited a nonexistent case that, as it turned out, she had obtained through ChatGPT.

The Court informed Counsel that it could not locate the case and requested that she furnish the Court with a copy. Plaintiff’s Counsel responded that she was “unable to furnish a copy of the decision.” The Court stated, “Although Attorney Lee did not expressly indicate as much in her Response, the reason she could not provide a copy of the case is that it does not exist – and indeed, Attorney Lee refers to the case at one point as ‘this non-existent case.’”

Plaintiff’s Counsel’s response to the Court stated, in part:

Believing that applying the minimum wage [in the relevant circumstances] under workers’ compensation law was uncontroversial, I invested considerable time searching for a case to support this position but was unsuccessful.

. . .

Consequently, I utilized the ChatGPT service, to which I am a subscribed and paying member, for assistance in case identification. ChatGPT was previously provided reliable information, such as locating sources for finding an antic [sic] furniture key. The case mentioned above was suggested by ChatGPT, I wish to clarify that I did not cite any specific reasoning or decision from this case.

The Court’s decision noted:

“Rule 11 imposes a duty on attorneys to certify that they have conducted a reasonable inquiry and have determined that any papers filed with the court are well grounded in fact, [and] legally tenable.” . . . At the very least, the duties imposed by Rule 11 require that attorneys read, and thereby confirm the existence and validity of, the legal authorities on which they rely. Indeed, we can think of no other way to ensure that the arguments made based on those authorities are “warranted by existing law,” Fed. R. Civ. P. 11(b)(2), or otherwise “legally tenable.”

(Citations omitted.)

The Court then cited Mata v. Avianca, Inc., No. 22-cv-1461 (PKC) (S.D.N.Y. June 22, 2023), an earlier case in which counsel presented nonexistent court precedent generated by ChatGPT. The Mata opinion stated, “A fake opinion is not ‘existing law’ and citation to a fake opinion does not provide a non-frivolous ground for extending, modifying, or reversing existing law, or for establishing new law. An attempt to persuade a court or oppose an adversary by relying on fake opinions is an abuse of the adversary system” (citations omitted).

The Court of Appeals concluded that Plaintiff’s Counsel had presented a false statement of law to the Court and “made no inquiry, much less the reasonable inquiry required by Rule 11 and long-standing precedent, into the validity of the arguments she presented” (emphasis in original). Accordingly, the Court referred her “to the Court’s Grievance Panel pursuant to Local Rule 46.2 for further investigation, and for consideration of a referral to the Committee on Admissions and Grievances.”

The bottom line: for attorneys (and pro se parties) in litigation, citing nonexistent law to a court can lead to Rule 11 sanctions and other consequences. Citing ChatGPT or another generative artificial intelligence (“AI”) tool as the source of the nonexistent law will not avoid those consequences.
