Audio recordings have played an important and persuasive role in legal matters for over 100 years. See Boyne City, G. & A.R. Co. v. Anderson, 146 Mich. 328, 330, 109 N.W. 429, 430 (1906). Such recordings have become so common and familiar that recent cases have developed more liberal standards for their admission. In one case, even unexplained defects in a tape recording did not prevent its admission into evidence. United States v. Traficant, 558 F. Supp. 996, 1002 (N.D. Ohio 1983).
Under the current federal rules, authenticating “an item of evidence” requires the proponent to “produce evidence sufficient to support a finding that the item is what the proponent claims it is.” Fed. R. Evid. 901(a). For voice recordings, lay witness opinion testimony “based on hearing the voice at any time under circumstances that connect it with the alleged speaker” is sufficient to establish a recording’s authenticity and have it admitted into evidence. Fed. R. Evid. 901(b)(5). But research shows that a person’s ability to verify another’s voice is vulnerable to voice impersonation. In the end, however, whether a “recording is accurate, authentic and generally trustworthy” is left to the discretion of the trial court. United States v. King, 587 F.2d 956, 961 (9th Cir. 1978).
With so little legal scrutiny of voice-based evidence—and with so much room for error—one can easily imagine the potential impact of a perfectly mimicked (but completely fake) audio recording in all sorts of legal disputes. From inculpatory statements in a criminal case to slanderous statements in a defamation case, the risk of admitting false audio recordings into evidence is a problem for which the legal community should prepare. According to Wired, “it may be as little as two or three years before realistic audio forgeries are good enough to fool the untrained ear, and only five or 10 years before forgeries can fool at least some types of forensic analysis.”
One way to reduce the risk of nefarious use of voice-mimicking software is to encourage (or require) its developers to include within the software a hidden function that automatically embeds forensic markers—like a digital watermark—into fake recordings. Forensic analysts could then examine a recording and opine on its authenticity based on the presence or absence of such markers.
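To make the idea concrete, the following is a deliberately simplified sketch, in Python, of how a marker might be embedded and later detected. It hides a short, hypothetical bit signature in the least significant bits of 16-bit audio samples; real forensic watermarking schemes are far more sophisticated and are designed to survive compression, re-recording, and editing. The marker value, function names, and sample data here are all illustrative assumptions, not any actual product's scheme.

```python
# Toy forensic-marker sketch: hide a known bit pattern in the least
# significant bits (LSBs) of audio samples. Illustrative only; real
# watermarks use robust techniques (e.g., spread-spectrum embedding).

MARKER = [1, 0, 1, 1, 0, 0, 1, 0]  # hypothetical 8-bit signature

def embed_marker(samples, marker=MARKER):
    """Return a copy of the samples with the marker written into the LSBs."""
    out = list(samples)
    for i, bit in enumerate(marker):
        if i < len(out):
            out[i] = (out[i] & ~1) | bit  # overwrite the lowest bit
    return out

def has_marker(samples, marker=MARKER):
    """Report whether the leading samples carry the marker in their LSBs."""
    if len(samples) < len(marker):
        return False
    return all((samples[i] & 1) == bit for i, bit in enumerate(marker))

# A "synthetic" clip gets marked at generation time; forensic review
# later checks for the signature.
fake_clip = [1000, -2000, 3000, 4000, -5000, 6000, 7000, 8000]
marked = embed_marker(fake_clip)
print(has_marker(marked))     # True: the signature is present
print(has_marker(fake_clip))  # False here: no signature was embedded
```

Because altering a sample's lowest bit changes its amplitude imperceptibly, such a marker would be inaudible; the obvious weakness, which real schemes address, is that a simple LSB mark is easily stripped or destroyed by re-encoding.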
As artificial intelligence opens a new world of tech achievement, its consequences cannot be ignored. Lawyers will have an important role as this technology develops.
John F. Barwell is an associate at Polsinelli in Phoenix.