August 01, 2017

POTUS, PCAST, and Forensics

By Jules Epstein

What does the president of the United States have to do with forensic science? Answering that could take one, indirectly, back to Abraham Lincoln, who signed the legislation that led to the creation of the National Academy of Sciences, which in 2009 issued a report that called into question the validity and precision of many forensic disciplines. That report was Strengthening Forensic Science in the United States: A Path Forward.

But today the connection is more direct. An important part of the advisory apparatus available to President Obama was the President’s Council of Advisors on Science and Technology (PCAST). PCAST’s September 2016 report Forensic Science in Criminal Courts: Ensuring Scientific Validity of Feature-Comparison Methods (PCAST Report)—by some of this nation’s preeminent scientists—presents “findings concerning the ‘foundational validity’ of the indicated methods as well as their ‘validity as applied’ in practice in the courts.”1

Why should judges and lawyers care? The PCAST Report shows that there are concerns even with the most highly regarded of forensic disciplines—DNA analysis—and that those farther down the scientific ladder (and some clearly nonscientific) have even more foundational problems.

The word “foundational” is the focus of the PCAST Report. It is the metric for assessing

that a method has been subjected to empirical testing by multiple groups, under conditions appropriate to its intended use. The studies must (a) demonstrate that the method is repeatable and reproducible and (b) provide valid estimates of the method’s accuracy (that is, how often the method reaches an incorrect conclusion) that indicate the method is appropriate to the intended application.2

The PCAST Report highlights a second and related concern: “[w]ithout appropriate estimates of accuracy, an examiner’s statement that two samples are similar—or even indistinguishable—is scientifically meaningless: it has no probative value, and considerable potential for prejudicial impact.”3 Whether “scientifically meaningless” will translate into judicially “inadmissible” remains to be seen.

How did the seven disciplines assessed by the PCAST Report fare? Discussed here are five prominent fields.

DNA Analysis

There is foundational validity for single-source (DNA from a single individual) and simple-mixture (DNA from two individuals) samples. However, for complex-mixture samples (DNA from three or more persons), “studies have established the foundational validity of some objective methods under limited circumstances (specifically, a three-person mixture in which the minor contributor constitutes at least 20 percent of the intact DNA in the mixture) but that substantially more evidence is needed to establish foundational validity across broader settings.”4

What does this mean? A judge should be hesitant to permit conclusions about whether a DNA profile corresponding to that of a known individual is contained within a complex mixture of biological evidence.

Latent Fingerprint Analysis

Latent print examination receives some support in the PCAST Report; it is deemed to have foundational validity but with “a false positive rate that is substantial and is likely to be higher than expected by many jurors based on longstanding claims about the infallibility of fingerprint analysis. The false-positive rate could be as high as 1 error in 306 cases.”5 Unmentioned in the report is the separate issue of whether the ultimate conclusion—that a print did come from an individual and no one else—has a clear scientific basis and what number or constellation of features suffices to make that call. The report also emphasizes the subjective nature of the analysis, the risk of biasing information distorting the analysis, and the need for more “rigorous” proficiency testing to ensure that any individual analyst’s assessment deserves to be credited.

Firearms Analysis

Here, the PCAST Report found some but insufficient research to show foundational validity. The report emphasizes that whatever current data about error rates are available need to be disclosed if the court permits firearms comparison testimony. Again, the report notes concerns about the potential for cognitive bias and the need for proficiency testing.6

Footwear Analysis

The PCAST Report is clear and concise here, finding no validity to matching testimony that “associate[s] shoeprints with particular shoes based on specific identifying marks. Such associations are unsupported by any meaningful evidence or estimates of their accuracy and thus are not scientifically valid.”7

Bitemark Analysis

Perhaps the most damning words were saved for this field. “PCAST considers the prospects of developing bitemark analysis into a scientifically valid method to be low. We advise against devoting significant resources to such efforts.”8

Reactions to the PCAST Report

Responses to the PCAST Report have varied in tone. Judge Alex Kozinski, who served as an advisor to its preparation, has written in stark terms: “Only the most basic form of DNA analysis is scientifically reliable, the study indicates. Some forensic methods have significant error rates and others are rank guesswork.”9 In slightly more muted terms, Judge Harry Edwards and law school dean Jennifer Mnookin, also advisors, wrote: “The report finds that many forensic techniques do not yet pass scientific muster. This strongly implies these techniques are not ready for use in the courtroom either.”10

The view from the prosecution side has been to the contrary, and deeply dismissive. The National District Attorneys Association released a press statement asserting that “the opinions expressed by PCAST in their report clearly and obviously disregard large bodies of scientific evidence to the contrary and rely, at times, on unreliable and discredited research.”11 It decries the report as “scientifically irresponsible” and cautions that adopting “any” of its recommendations would have a “devastating effect” on law enforcement.12

In a recent pleading, filed in a postconviction case in which the defendant/petitioner was challenging the admissibility of bitemark evidence at his trial, the prosecution’s reaction to the PCAST Report was vitriolic.

Consiglio [the Pennsylvania prosecutor] then attacks the authors of the 2016 PCAST report. He writes that “the very composition of PCAST speaks of its bias and lack of independent opinions” and “the lack of any working forensic scientists with real-world experience is evident throughout the report.” He singles out S. James Gates, Jr., whom Consiglio accuses of “no familiarity with, or even interest in, areas of forensic science.” Gates is a physicist at MIT, a member of the National Academies of Science and a recipient of the National Medal of Science. He’s a member of the National Commission on Forensic Science and the Forensic Science Standards Board, two federal bodies tasked with improving the use of forensics. If Gates isn’t qualified to evaluate the scientific validity of a field of forensics, no one is.13

For the judiciary, what does the PCAST Report mean? At a minimum, a judge must decide whether the report’s conclusions have merit, and if so: (1) be proactive and inquire into a discipline’s “foundational validity,” practitioner proficiency, risk of biasing information, and error rate, and then determine admissibility and scope of testimony; or (2) wait for a motion to exclude or limit forensic discipline testimony before weighing these factors. Particular attention should be paid to terminology used by an expert. As the report emphasizes:

[C]ourts should never permit scientifically indefensible claims such as: “zero,” “vanishingly small,” “essentially zero,” “negligible,” “minimal,” or “microscopic” error rates; “100 percent certainty” or proof “to a reasonable degree of scientific certainty;” identification “to the exclusion of all other sources;” or a chance of error so remote as to be a “practical impossibility.”14

Judges will also have to assess whether the PCAST Report itself is admissible as a government report under Federal Rule of Evidence 803(8). If so, it becomes “evidence” on its own for both an admissibility hearing and trial, as well as a document appropriate for examining witnesses.

And lawyers? The fundamental obligation is clear. It is to know both the strengths and limits of every forensic discipline, and to incorporate science into their understanding of evidence before presenting it in the courtroom.

Endnotes

1. President’s Council of Advisors on Sci. & Tech., Forensic Science in Criminal Courts: Ensuring Scientific Validity of Feature-Comparison Methods, at xi (2016) [hereinafter PCAST Report].

2. Id. at 5.

3. Id. at 6.

4. Id. at 7–8.

5. Id. at 9–11.

6. Id. at 11–12.

7. Id. at 12–13.

8. Id. at 8–9.

9. Alex Kozinski, Rejecting Voodoo Science in the Courtroom, Wall St. J., Sept. 19, 2016.

10. Harry T. Edwards & Jennifer L. Mnookin, A Wake-Up Call on the Junk Science Infesting Our Courtrooms, Wash. Post, Sept. 20, 2016.

11. Press Release, Nat’l Dist. Attorneys Ass’n, National District Attorneys Association Slams President’s Council of Advisors on Science and Technology Report (Sept. 2, 2016), http://www.ndaa.org/pdf/NDAA%20Press%20Release%20on%20PCAST%20Report.pdf.

12. Id.

13. Radley Balko, Incredibly, Prosecutors Are Still Defending Bite Mark Analysis, Wash. Post, Jan. 30, 2017.

14. PCAST Report, supra note 1, at 19.



Jules Epstein (jules.epstein@temple.edu) is a professor of law and director of advocacy programs at Temple University Beasley School of Law in Philadelphia, Pennsylvania, and a member of the National Judicial College faculty since 2007. He was also a member of the National Commission on Forensic Science. The views expressed here are solely his own and are not intended to represent the views of the attorney general, the United States Department of Justice, or any government entity or agency.